16142 1727204100.10263: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-G1p executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 16142 1727204100.10702: Added group all to inventory 16142 1727204100.10704: Added group ungrouped to inventory 16142 1727204100.10709: Group all now contains ungrouped 16142 1727204100.10712: Examining possible inventory source: /tmp/network-M6W/inventory-5vW.yml 16142 1727204100.31938: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 16142 1727204100.32010: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 16142 1727204100.32038: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 16142 1727204100.32103: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 16142 1727204100.32187: Loaded config def from plugin (inventory/script) 16142 1727204100.32190: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 16142 1727204100.32235: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 16142 1727204100.32337: Loaded config def from plugin (inventory/yaml) 16142 1727204100.32339: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 16142 1727204100.32438: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 16142 1727204100.32902: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 16142 1727204100.32906: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 16142 1727204100.32909: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 16142 1727204100.32915: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 16142 1727204100.32920: Loading data from /tmp/network-M6W/inventory-5vW.yml 16142 1727204100.32995: /tmp/network-M6W/inventory-5vW.yml was not parsable by auto 16142 1727204100.33068: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 16142 1727204100.33110: Loading data from /tmp/network-M6W/inventory-5vW.yml 16142 1727204100.33376: group all already in inventory 16142 1727204100.33383: set inventory_file for managed-node1 16142 1727204100.33388: set inventory_dir for managed-node1 16142 1727204100.33389: Added host managed-node1 to inventory 16142 1727204100.33391: Added host managed-node1 to group all 16142 1727204100.33392: set ansible_host for managed-node1 16142 1727204100.33393: set ansible_ssh_extra_args for managed-node1 16142 1727204100.33397: set inventory_file for managed-node2 16142 1727204100.33400: set inventory_dir for managed-node2 16142 1727204100.33401: Added host managed-node2 to inventory 16142 1727204100.33402: Added host managed-node2 to group 
all 16142 1727204100.33403: set ansible_host for managed-node2 16142 1727204100.33404: set ansible_ssh_extra_args for managed-node2 16142 1727204100.33407: set inventory_file for managed-node3 16142 1727204100.33409: set inventory_dir for managed-node3 16142 1727204100.33410: Added host managed-node3 to inventory 16142 1727204100.33411: Added host managed-node3 to group all 16142 1727204100.33412: set ansible_host for managed-node3 16142 1727204100.33413: set ansible_ssh_extra_args for managed-node3 16142 1727204100.33416: Reconcile groups and hosts in inventory. 16142 1727204100.33420: Group ungrouped now contains managed-node1 16142 1727204100.33422: Group ungrouped now contains managed-node2 16142 1727204100.33423: Group ungrouped now contains managed-node3 16142 1727204100.33507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 16142 1727204100.33642: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 16142 1727204100.33694: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 16142 1727204100.33722: Loaded config def from plugin (vars/host_group_vars) 16142 1727204100.33725: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 16142 1727204100.33735: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 16142 1727204100.33744: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 16142 1727204100.33790: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 16142 1727204100.34157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204100.34255: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 16142 1727204100.34297: Loaded config def from plugin (connection/local) 16142 1727204100.34300: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 16142 1727204100.35000: Loaded config def from plugin (connection/paramiko_ssh) 16142 1727204100.35003: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 16142 1727204100.35997: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 16142 1727204100.36041: Loaded config def from plugin (connection/psrp) 16142 1727204100.36044: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 16142 1727204100.36936: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 16142 1727204100.36980: Loaded config def from plugin (connection/ssh) 16142 1727204100.36983: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 16142 1727204100.37704: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 16142 1727204100.37740: Loaded config def from plugin (connection/winrm) 16142 1727204100.37743: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 16142 1727204100.37772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 16142 1727204100.37837: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 16142 1727204100.37901: Loaded config def from plugin (shell/cmd) 16142 1727204100.37903: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 16142 1727204100.37932: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 16142 1727204100.37998: Loaded config def from plugin (shell/powershell) 16142 1727204100.38000: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 16142 1727204100.38051: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 16142 1727204100.38236: Loaded config def from plugin (shell/sh) 16142 1727204100.38238: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 16142 1727204100.38275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 16142 1727204100.38408: Loaded config def from plugin (become/runas) 16142 1727204100.38412: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 16142 1727204100.38647: Loaded config def from plugin (become/su) 16142 1727204100.38650: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 16142 1727204100.38819: Loaded config def from plugin (become/sudo) 16142 1727204100.38822: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 16142 1727204100.38861: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml 16142 1727204100.39272: in VariableManager get_vars() 16142 1727204100.39294: done with get_vars() 16142 1727204100.39442: trying /usr/local/lib/python3.12/site-packages/ansible/modules 16142 1727204100.44433: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 16142 1727204100.44567: in VariableManager get_vars() 16142 1727204100.44573: done with get_vars() 16142 1727204100.44576: variable 'playbook_dir' from source: magic vars 16142 1727204100.44577: variable 'ansible_playbook_python' from source: magic vars 16142 1727204100.44578: variable 'ansible_config_file' from 
source: magic vars 16142 1727204100.44579: variable 'groups' from source: magic vars 16142 1727204100.44580: variable 'omit' from source: magic vars 16142 1727204100.44580: variable 'ansible_version' from source: magic vars 16142 1727204100.44581: variable 'ansible_check_mode' from source: magic vars 16142 1727204100.44582: variable 'ansible_diff_mode' from source: magic vars 16142 1727204100.44582: variable 'ansible_forks' from source: magic vars 16142 1727204100.44583: variable 'ansible_inventory_sources' from source: magic vars 16142 1727204100.44584: variable 'ansible_skip_tags' from source: magic vars 16142 1727204100.44584: variable 'ansible_limit' from source: magic vars 16142 1727204100.44585: variable 'ansible_run_tags' from source: magic vars 16142 1727204100.44586: variable 'ansible_verbosity' from source: magic vars 16142 1727204100.44619: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml 16142 1727204100.46060: in VariableManager get_vars() 16142 1727204100.46080: done with get_vars() 16142 1727204100.46091: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 16142 1727204100.47149: in VariableManager get_vars() 16142 1727204100.47166: done with get_vars() 16142 1727204100.47214: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 16142 1727204100.47324: in VariableManager get_vars() 16142 1727204100.47343: done with get_vars() 16142 1727204100.47519: in VariableManager get_vars() 16142 1727204100.47536: done with get_vars() 16142 1727204100.47545: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 16142 1727204100.47639: in VariableManager get_vars() 16142 1727204100.47654: done with get_vars() 16142 1727204100.48877: in VariableManager get_vars() 16142 1727204100.48894: done with get_vars() 16142 1727204100.48899: variable 'omit' from source: magic vars 16142 1727204100.48919: variable 'omit' from source: magic vars 16142 1727204100.48958: in VariableManager get_vars() 16142 1727204100.48985: done with get_vars() 16142 1727204100.50046: in VariableManager get_vars() 16142 1727204100.50063: done with get_vars() 16142 1727204100.50107: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 16142 1727204100.51053: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 16142 1727204100.51869: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 16142 1727204100.54178: in VariableManager get_vars() 16142 1727204100.54203: done with get_vars() 16142 1727204100.55521: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 16142 1727204100.55698: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 16142 1727204100.58939: in VariableManager get_vars() 16142 1727204100.58962: done with get_vars() 16142 1727204100.58976: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 16142 1727204100.59075: in VariableManager get_vars() 16142 1727204100.59095: done with get_vars() 16142 1727204100.59229: in VariableManager get_vars() 16142 1727204100.59249: done with get_vars() 16142 1727204100.59524: in VariableManager get_vars() 16142 1727204100.59543: done with get_vars() 16142 1727204100.59548: variable 'omit' from source: magic vars 16142 1727204100.59558: variable 'omit' from source: magic vars 16142 1727204100.59719: variable 'controller_profile' from source: play vars 16142 1727204100.59766: in VariableManager get_vars() 16142 1727204100.59779: done with get_vars() 16142 1727204100.59798: in VariableManager get_vars() 16142 1727204100.59812: done with get_vars() 16142 1727204100.59844: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 16142 1727204100.60000: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 16142 1727204100.60086: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 16142 1727204100.60490: in VariableManager get_vars() 16142 1727204100.60512: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 16142 1727204100.63457: in VariableManager get_vars() 16142 1727204100.63484: done with get_vars() 16142 1727204100.63489: variable 'omit' from source: magic vars 16142 1727204100.63501: variable 'omit' from source: magic vars 16142 1727204100.63535: in VariableManager get_vars() 16142 1727204100.63553: done with get_vars() 16142 1727204100.63583: in VariableManager get_vars() 16142 1727204100.63602: done with get_vars() 16142 1727204100.63635: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 16142 1727204100.63753: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 16142 1727204100.63943: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 16142 1727204100.66706: in VariableManager get_vars() 16142 1727204100.66735: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 16142 1727204100.69123: in VariableManager get_vars() 16142 1727204100.69151: done with get_vars() 16142 1727204100.69157: variable 'omit' from source: magic vars 16142 1727204100.69170: variable 'omit' from source: magic vars 16142 1727204100.69200: in VariableManager get_vars() 16142 1727204100.69236: done with get_vars() 16142 1727204100.69257: in VariableManager get_vars() 16142 1727204100.69282: done with get_vars() 16142 1727204100.69401: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 16142 1727204100.69530: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 
16142 1727204100.69617: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 16142 1727204100.70051: in VariableManager get_vars() 16142 1727204100.70154: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 16142 1727204100.73294: in VariableManager get_vars() 16142 1727204100.73322: done with get_vars() 16142 1727204100.73326: variable 'omit' from source: magic vars 16142 1727204100.73453: variable 'omit' from source: magic vars 16142 1727204100.73504: in VariableManager get_vars() 16142 1727204100.73529: done with get_vars() 16142 1727204100.73550: in VariableManager get_vars() 16142 1727204100.73580: done with get_vars() 16142 1727204100.73613: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 16142 1727204100.73761: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 16142 1727204100.73848: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 16142 1727204100.74335: in VariableManager get_vars() 16142 1727204100.74367: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 16142 1727204100.78349: in VariableManager get_vars() 16142 1727204100.78390: done with get_vars() 16142 1727204100.78402: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 16142 1727204100.78970: in VariableManager get_vars() 16142 1727204100.79000: done with get_vars() 16142 1727204100.79206: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 16142 1727204100.79222: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 16142 1727204100.79552: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 16142 1727204100.79745: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 16142 1727204100.79748: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-G1p/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 16142 1727204100.79787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 16142 1727204100.79819: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 16142 1727204100.80011: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 16142 1727204100.80079: Loaded config def from plugin (callback/default) 16142 1727204100.80082: Loading CallbackModule 'default' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 16142 1727204100.81933: Loaded config def from plugin (callback/junit) 16142 1727204100.81936: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 16142 1727204100.81992: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 16142 1727204100.82070: Loaded config def from plugin (callback/minimal) 16142 1727204100.82072: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 16142 1727204100.82119: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 16142 1727204100.82257: Loaded config def from plugin (callback/tree) 16142 1727204100.82260: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 16142 1727204100.82393: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 16142 1727204100.82396: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-G1p/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
PLAYBOOK: tests_bond_removal_nm.yml ********************************************
2 plays in /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml
16142 1727204100.82429: in VariableManager get_vars()
16142 1727204100.82444: done with get_vars()
16142 1727204100.82450: in VariableManager get_vars()
16142 1727204100.82458: done with get_vars()
16142 1727204100.82473: variable 'omit' from source: magic vars
16142 1727204100.82521: in VariableManager get_vars()
16142 1727204100.82535: done with get_vars()
16142 1727204100.82557: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bond_removal.yml' with nm as provider] *****
16142 1727204100.83311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
16142 1727204100.83496: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
16142 1727204100.84681: getting the remaining hosts for this loop
16142 1727204100.84684: done getting the remaining hosts for this loop
16142 1727204100.84688: getting the next task for host managed-node2
16142 1727204100.84693: done getting next task for host managed-node2
16142 1727204100.84695: ^ task is: TASK: Gathering Facts
16142 1727204100.84702: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
16142 1727204100.84705: getting variables
16142 1727204100.84706: in VariableManager get_vars()
16142 1727204100.84722: Calling all_inventory to load vars for managed-node2
16142 1727204100.84730: Calling groups_inventory to load vars for managed-node2
16142 1727204100.85292: Calling all_plugins_inventory to load vars for managed-node2
16142 1727204100.85310: Calling all_plugins_play to load vars for managed-node2
16142 1727204100.85322: Calling groups_plugins_inventory to load vars for managed-node2
16142 1727204100.85326: Calling groups_plugins_play to load vars for managed-node2
16142 1727204100.85367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16142 1727204100.85544: done with get_vars()
16142 1727204100.85554: done getting variables
16142 1727204100.85787: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:6
Tuesday 24 September 2024 14:55:00 -0400 (0:00:00.034) 0:00:00.034 *****
16142 1727204100.85814: entering _queue_task() for managed-node2/gather_facts
16142 1727204100.85815: Creating lock for gather_facts
16142 1727204100.86440: worker is 1 (out of 1 available)
16142 1727204100.86452: exiting _queue_task() for managed-node2/gather_facts
16142 1727204100.86468: done queuing things up, now waiting for results queue to drain
16142 1727204100.86471: waiting for pending results...
16142 1727204100.87889: running TaskExecutor() for managed-node2/TASK: Gathering Facts 16142 1727204100.88008: in run() - task 0affcd87-79f5-fddd-f6c7-0000000001bc 16142 1727204100.88031: variable 'ansible_search_path' from source: unknown 16142 1727204100.88088: calling self._execute() 16142 1727204100.88160: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204100.88182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204100.88198: variable 'omit' from source: magic vars 16142 1727204100.88315: variable 'omit' from source: magic vars 16142 1727204100.88348: variable 'omit' from source: magic vars 16142 1727204100.88403: variable 'omit' from source: magic vars 16142 1727204100.88455: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204100.88554: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204100.88583: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204100.88606: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204100.88622: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204100.88667: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204100.88677: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204100.88686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204100.88803: Set connection var ansible_timeout to 10 16142 1727204100.88811: Set connection var ansible_connection to ssh 16142 1727204100.88820: Set connection var ansible_shell_type to sh 16142 1727204100.88828: Set connection var ansible_shell_executable to /bin/sh 16142 1727204100.88836: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204100.88846: Set connection var ansible_pipelining to False 16142 1727204100.88881: variable 'ansible_shell_executable' from source: unknown 16142 1727204100.88984: variable 'ansible_connection' from source: unknown 16142 1727204100.88993: variable 'ansible_module_compression' from source: unknown 16142 1727204100.89000: variable 'ansible_shell_type' from source: unknown 16142 1727204100.89007: variable 'ansible_shell_executable' from source: unknown 16142 1727204100.89014: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204100.89021: variable 'ansible_pipelining' from source: unknown 16142 1727204100.89027: variable 'ansible_timeout' from source: unknown 16142 1727204100.89033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204100.90102: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204100.90119: variable 'omit' from source: magic vars 16142 1727204100.90128: starting attempt loop 16142 1727204100.90135: running the handler 16142 1727204100.90155: variable 'ansible_facts' from source: unknown 16142 1727204100.90222: _low_level_execute_command(): starting 16142 1727204100.90236: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204100.91681: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204100.91699: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204100.91715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204100.91744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204100.91790: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204100.91804: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204100.91819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204100.91843: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204100.91861: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204100.91875: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204100.91889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204100.91903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204100.91920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204100.91931: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204100.91943: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204100.91966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204100.92043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204100.92064: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204100.92085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204100.92362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204100.93892: stdout chunk (state=3): >>>/root <<< 16142 1727204100.94073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204100.94098: stdout chunk (state=3): >>><<< 16142 1727204100.94101: stderr chunk (state=3): >>><<< 16142 1727204100.94273: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204100.94276: _low_level_execute_command(): starting 16142 1727204100.94279: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204100.9412155-16348-101211323707141 `" && echo ansible-tmp-1727204100.9412155-16348-101211323707141="` echo /root/.ansible/tmp/ansible-tmp-1727204100.9412155-16348-101211323707141 `" ) && sleep 0' 16142 1727204100.95667: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204100.95719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204100.95723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204100.95767: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204100.95770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204100.95774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204100.95776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204100.96103: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204100.96128: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204100.96152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204100.96240: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204100.98105: stdout chunk (state=3): >>>ansible-tmp-1727204100.9412155-16348-101211323707141=/root/.ansible/tmp/ansible-tmp-1727204100.9412155-16348-101211323707141 <<< 16142 1727204100.98286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204100.98335: stderr chunk (state=3): >>><<< 16142 1727204100.98339: stdout chunk (state=3): >>><<< 16142 1727204100.98371: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204100.9412155-16348-101211323707141=/root/.ansible/tmp/ansible-tmp-1727204100.9412155-16348-101211323707141 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204100.98472: variable 'ansible_module_compression' from source: unknown 16142 1727204100.98475: ANSIBALLZ: Using generic lock for ansible.legacy.setup 16142 1727204100.98581: ANSIBALLZ: Acquiring lock 16142 1727204100.98585: ANSIBALLZ: Lock acquired: 140089297016096 16142 1727204100.98587: ANSIBALLZ: Creating module 16142 1727204101.63953: ANSIBALLZ: Writing module into payload 16142 1727204101.64141: ANSIBALLZ: Writing module 16142 1727204101.64180: ANSIBALLZ: Renaming module 16142 1727204101.64190: ANSIBALLZ: Done creating module 16142 1727204101.64213: variable 'ansible_facts' from source: unknown 16142 1727204101.64225: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204101.64258: _low_level_execute_command(): starting 16142 1727204101.64271: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 16142 1727204101.66197: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204101.66214: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204101.66229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204101.66255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204101.66304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204101.66315: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204101.66329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204101.66349: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204101.66365: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204101.66384: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204101.66398: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204101.66407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204101.66424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204101.66441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204101.66452: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204101.66470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204101.66548: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master <<< 16142 1727204101.66582: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204101.66599: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204101.66772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204101.68449: stdout chunk (state=3): >>>PLATFORM <<< 16142 1727204101.68538: stdout chunk (state=3): >>>Linux <<< 16142 1727204101.68556: stdout chunk (state=3): >>>FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 16142 1727204101.68787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204101.68791: stdout chunk (state=3): >>><<< 16142 1727204101.68793: stderr chunk (state=3): >>><<< 16142 1727204101.68870: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204101.68881 [managed-node2]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 16142 1727204101.68960: _low_level_execute_command(): starting 16142 1727204101.68965: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 16142 1727204101.69309: Sending initial data 16142 1727204101.69313: Sent initial data (1181 bytes) 16142 1727204101.70440: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204101.70483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204101.70499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204101.70518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204101.70569: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204101.70680: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204101.70699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204101.70717: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204101.70730: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204101.70742: stderr chunk (state=3): >>>debug1: 
re-parsing configuration <<< 16142 1727204101.70755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204101.70771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204101.70789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204101.70807: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204101.70821: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204101.70836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204101.71041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204101.71058: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204101.71077: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204101.71242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204101.75084: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 16142 1727204101.75490: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204101.75673: stderr chunk (state=3): >>><<< 16142 1727204101.75677: stdout chunk (state=3): >>><<< 16142 1727204101.75679: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 16142 1727204101.75781: variable 'ansible_facts' from source: unknown 16142 1727204101.75784: variable 'ansible_facts' from source: unknown 16142 1727204101.75787: variable 'ansible_module_compression' from source: unknown 16142 1727204101.75789: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 16142 1727204101.75890: variable 'ansible_facts' from source: unknown 16142 1727204101.75940: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204100.9412155-16348-101211323707141/AnsiballZ_setup.py 16142 1727204101.76626: Sending initial data 16142 1727204101.76629: Sent initial data (154 bytes) 16142 1727204101.79080: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204101.79174: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204101.79190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204101.79210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204101.79254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204101.79273: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204101.79289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204101.79308: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204101.79320: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204101.79331: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204101.79345: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204101.79359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204101.79391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204101.79405: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204101.79418: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204101.79432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204101.79658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204101.79784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204101.79800: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204101.79933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204101.81676: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204101.81709: stderr 
chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204101.81750: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpbb3s6602 /root/.ansible/tmp/ansible-tmp-1727204100.9412155-16348-101211323707141/AnsiballZ_setup.py <<< 16142 1727204101.81785: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204101.84790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204101.84976: stderr chunk (state=3): >>><<< 16142 1727204101.84980: stdout chunk (state=3): >>><<< 16142 1727204101.84982: done transferring module to remote 16142 1727204101.84984: _low_level_execute_command(): starting 16142 1727204101.84987: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204100.9412155-16348-101211323707141/ /root/.ansible/tmp/ansible-tmp-1727204100.9412155-16348-101211323707141/AnsiballZ_setup.py && sleep 0' 16142 1727204101.86601: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204101.86605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204101.86771: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204101.86775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204101.86777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204101.86780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204101.86957: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204101.86961: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204101.87021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204101.88901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204101.88983: stderr chunk (state=3): >>><<< 16142 1727204101.88986: stdout chunk (state=3): >>><<< 16142 1727204101.89074: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204101.89077: _low_level_execute_command(): starting 16142 1727204101.89080: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204100.9412155-16348-101211323707141/AnsiballZ_setup.py && sleep 0' 16142 1727204101.91656: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204101.91661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204101.91699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204101.91702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204101.91705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204101.91762: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204101.92101: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204101.92104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204101.92178: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204101.94314: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 16142 1727204101.94318: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 16142 1727204101.94371: stdout chunk (state=3): >>>import '_io' # <<< 16142 1727204101.94386: stdout chunk (state=3): >>>import 'marshal' # <<< 16142 1727204101.94413: stdout chunk (state=3): >>>import 'posix' # <<< 16142 1727204101.94453: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 16142 1727204101.94457: stdout chunk (state=3): >>># installing zipimport hook <<< 16142 1727204101.94502: stdout chunk (state=3): >>>import 'time' # <<< 16142 1727204101.94505: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 16142 1727204101.94557: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 16142 1727204101.94578: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc 
matches /usr/lib64/python3.9/codecs.py <<< 16142 1727204101.94601: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 16142 1727204101.94604: stdout chunk (state=3): >>>import '_codecs' # <<< 16142 1727204101.94634: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709adb3dc0> <<< 16142 1727204101.94666: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 16142 1727204101.94779: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ad583a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709adb3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709adb3ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 16142 1727204101.94823: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ad58490> <<< 16142 1727204101.94855: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py <<< 16142 1727204101.94872: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 16142 1727204101.94901: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py <<< 16142 1727204101.94922: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 16142 1727204101.94950: stdout chunk (state=3): >>>import '_abc' # <<< 16142 1727204101.94981: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ad58940><<< 16142 1727204101.94984: stdout chunk (state=3): >>> <<< 16142 1727204101.95011: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ad58670> <<< 16142 1727204101.95105: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 16142 1727204101.95124: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 16142 1727204101.95187: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 16142 1727204101.95222: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 16142 1727204101.95303: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 16142 1727204101.95383: stdout chunk (state=3): >>>import '_stat' # <<< 16142 1727204101.95386: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ad0f190> <<< 16142 1727204101.95406: stdout chunk (state=3): 
>>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 16142 1727204101.95435: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 16142 1727204101.95505: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ad0f220> <<< 16142 1727204101.95539: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 16142 1727204101.95542: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 16142 1727204101.95575: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' <<< 16142 1727204101.95578: stdout chunk (state=3): >>>import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ad32850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ad0f940> <<< 16142 1727204101.95616: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ad70880> <<< 16142 1727204101.95641: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py <<< 16142 1727204101.95644: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 16142 1727204101.95647: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ad08d90> <<< 16142 1727204101.95697: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 16142 1727204101.95714: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ad32d90> <<< 16142 1727204101.95781: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ad58970> <<< 16142 1727204101.95808: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 16142 1727204101.96150: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 16142 1727204101.96171: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 16142 1727204101.96198: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 16142 1727204101.96205: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 16142 1727204101.96220: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 16142 1727204101.96236: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 16142 1727204101.96259: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 16142 1727204101.96274: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 16142 1727204101.96288: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709acadf10> <<< 16142 1727204101.96340: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709acb40a0> <<< 16142 1727204101.96356: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 16142 1727204101.96370: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 16142 1727204101.96392: stdout chunk (state=3): >>>import '_sre' # <<< 16142 1727204101.96412: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 16142 1727204101.96433: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 16142 1727204101.96456: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py <<< 16142 1727204101.96459: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 16142 1727204101.96482: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709aca75b0> <<< 16142 1727204101.96501: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709acae6a0> <<< 16142 1727204101.96504: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709acad3d0> <<< 16142 1727204101.96538: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 16142 1727204101.96618: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 16142 1727204101.96645: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 16142 1727204101.96678: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 16142 1727204101.96697: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches 
/usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 16142 1727204101.96742: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709ac31e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac31910> <<< 16142 1727204101.96759: stdout chunk (state=3): >>>import 'itertools' # <<< 16142 1727204101.96790: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py <<< 16142 1727204101.96793: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac31f10> <<< 16142 1727204101.96815: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 16142 1727204101.96827: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 16142 1727204101.96855: stdout chunk (state=3): >>>import '_operator' # <<< 16142 1727204101.96858: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac31fd0> <<< 16142 1727204101.96883: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py <<< 16142 1727204101.96888: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac420d0> <<< 16142 1727204101.96905: stdout chunk (state=3): >>>import '_collections' # <<< 16142 1727204101.96960: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac89d90> <<< 16142 1727204101.96963: stdout chunk (state=3): >>>import '_functools' # <<< 16142 1727204101.96987: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac82670> <<< 16142 1727204101.97055: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' <<< 16142 1727204101.97058: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac956d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709acb5e80> <<< 16142 1727204101.97088: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 16142 1727204101.97121: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709ac42cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac892b0> <<< 16142 1727204101.97169: stdout chunk 
(state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204101.97180: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709ac952e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709acbba30> <<< 16142 1727204101.97203: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 16142 1727204101.97239: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 16142 1727204101.97257: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 16142 1727204101.97272: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 16142 1727204101.97293: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac42eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac42df0> <<< 16142 1727204101.97323: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py <<< 16142 1727204101.97335: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac42d60> <<< 16142 1727204101.97348: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 16142 1727204101.97362: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 16142 1727204101.97384: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 16142 1727204101.97396: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 16142 1727204101.97417: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 16142 1727204101.97472: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 16142 1727204101.97494: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py <<< 16142 1727204101.97507: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a9723d0> <<< 16142 1727204101.97531: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 16142 1727204101.97542: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 16142 1727204101.97572: stdout 
chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a9724c0> <<< 16142 1727204101.97694: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac4af40> <<< 16142 1727204101.97735: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac44a90> <<< 16142 1727204101.97750: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac44490> <<< 16142 1727204101.97767: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 16142 1727204101.97781: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 16142 1727204101.97823: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 16142 1727204101.97835: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 16142 1727204101.97854: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py <<< 16142 1727204101.97869: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a8a6220> <<< 16142 1727204101.97905: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a95d520> <<< 16142 1727204101.97953: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac44f10> <<< 16142 1727204101.97971: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709acbb0a0> <<< 16142 1727204101.97983: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 16142 1727204101.98008: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 16142 1727204101.98030: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py <<< 16142 1727204101.98043: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a8b8b50> <<< 16142 1727204101.98058: stdout chunk (state=3): >>>import 'errno' # <<< 16142 1727204101.98096: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a8b8e80> <<< 16142 1727204101.98189: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a8c9790> <<< 
16142 1727204101.98200: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 16142 1727204101.98298: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a8c9cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a857400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a8b8f70> <<< 16142 1727204101.98329: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 16142 1727204101.98341: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 16142 1727204101.98381: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a8682e0> <<< 16142 1727204101.98395: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a8c9610> <<< 16142 1727204101.98410: stdout chunk (state=3): >>>import 'pwd' # <<< 16142 1727204101.98433: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a8683a0> <<< 16142 1727204101.98476: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac42a30> <<< 16142 1727204101.98488: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 16142 1727204101.98515: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 16142 1727204101.98627: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a884700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 16142 1727204101.98643: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a8849d0> import 'bisect' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f709a8847c0> <<< 16142 1727204101.98679: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a8848b0> <<< 16142 1727204101.98706: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py <<< 16142 1727204101.98724: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 16142 1727204101.98913: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204101.98925: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a884d00> <<< 16142 1727204101.98956: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204101.98971: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a88f250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a884940> <<< 16142 1727204101.98980: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a877a90> <<< 16142 1727204101.99005: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac42610> <<< 16142 1727204101.99026: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 16142 1727204101.99087: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 16142 1727204101.99122: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a884af0> <<< 16142 1727204101.99277: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 16142 1727204101.99292: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f709a7ad6d0> <<< 16142 1727204101.99544: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 16142 1727204101.99639: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204101.99674: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py <<< 16142 1727204101.99678: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204101.99692: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204101.99709: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py <<< 16142 1727204101.99727: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.00990: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.01898: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1ae820> <<< 16142 1727204102.01918: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 16142 1727204102.01937: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 16142 1727204102.01962: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 16142 1727204102.01993: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a1ae160> <<< 16142 1727204102.02035: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1ae280> <<< 16142 1727204102.02066: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1aef70> <<< 16142 1727204102.02086: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 16142 1727204102.02139: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1ae4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1aed90> <<< 16142 1727204102.02142: stdout chunk (state=3): >>>import 'atexit' # <<< 16142 1727204102.02189: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204102.02193: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a1aefd0> <<< 16142 1727204102.02195: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 16142 1727204102.02221: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 16142 1727204102.02256: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1ae100> <<< 16142 1727204102.02291: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc 
matches /usr/lib64/python3.9/platform.py <<< 16142 1727204102.02296: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 16142 1727204102.02312: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 16142 1727204102.02332: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 16142 1727204102.02357: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 16142 1727204102.02446: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1830d0> <<< 16142 1727204102.02487: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204102.02490: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a088340> <<< 16142 1727204102.02526: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204102.02532: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a088040> <<< 16142 1727204102.02538: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 16142 1727204102.02550: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 16142 1727204102.02582: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a088ca0> <<< 16142 1727204102.02594: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a196dc0> <<< 16142 1727204102.02756: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1963a0> <<< 16142 1727204102.02785: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py <<< 16142 1727204102.02790: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 16142 1727204102.02802: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a196fd0> <<< 16142 1727204102.02827: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 16142 1727204102.02830: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 16142 1727204102.02873: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py <<< 16142 1727204102.02876: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 16142 1727204102.02897: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 16142 1727204102.02900: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 16142 1727204102.02915: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py <<< 16142 1727204102.02927: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1e3d30> <<< 16142 1727204102.03009: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1b5d30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1b5400> <<< 16142 1727204102.03013: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a161b20> <<< 16142 1727204102.03047: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204102.03050: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a1b5520> <<< 16142 1727204102.03079: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1b5550> <<< 16142 1727204102.03104: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 16142 1727204102.03107: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 16142 1727204102.03133: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 16142 1727204102.03166: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 16142 1727204102.03231: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a0f6fd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1f5250> <<< 16142 1727204102.03253: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 16142 1727204102.03267: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 16142 1727204102.03314: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204102.03349: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a0f4850> <<< 16142 1727204102.03352: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1f53d0> <<< 16142 1727204102.03354: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 16142 1727204102.03390: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 16142 1727204102.03417: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 16142 1727204102.03420: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 16142 1727204102.03482: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1f5ca0> <<< 16142 1727204102.03605: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a0f47f0> <<< 16142 1727204102.03703: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204102.03706: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a18ec10> <<< 16142 1727204102.03731: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a1f5fa0> <<< 16142 1727204102.03784: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a1f5550> <<< 16142 1727204102.03788: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1ee910> <<< 16142 1727204102.03814: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py <<< 16142 1727204102.03821: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 16142 1727204102.03836: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 16142 1727204102.03852: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 16142 1727204102.03898: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204102.03902: stdout chunk (state=3): >>># extension module '_socket' executed from 
'/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a0e8940> <<< 16142 1727204102.04069: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204102.04090: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a106d90> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a0f3580> <<< 16142 1727204102.04127: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204102.04130: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a0e8ee0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a0f39a0> # zipimport: zlib available <<< 16142 1727204102.04147: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 16142 1727204102.04172: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.04242: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.04329: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.04333: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.04337: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py <<< 16142 1727204102.04355: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 16142 1727204102.04372: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 16142 1727204102.04384: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.04475: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.04575: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.05016: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.05469: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # <<< 16142 1727204102.05498: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 16142 1727204102.05501: stdout chunk (state=3): >>>import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 16142 1727204102.05523: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 16142 1727204102.05526: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 16142 1727204102.05577: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a12f7f0> <<< 16142 1727204102.05647: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 16142 1727204102.05652: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1348b0> <<< 16142 1727204102.05669: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099c93970> <<< 16142 1727204102.05718: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 16142 1727204102.05721: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.05736: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.05751: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py <<< 16142 1727204102.05769: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.05887: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.06014: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 16142 1727204102.06042: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a16c730> # zipimport: zlib available <<< 16142 1727204102.06435: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.06797: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.06856: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.06929: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py <<< 16142 1727204102.06932: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.06959: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.06998: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py <<< 16142 1727204102.07001: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.07057: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.07144: stdout chunk (state=3): >>>import ansible.module_utils.errors # 
loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py <<< 16142 1727204102.07147: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.07154: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.07171: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available <<< 16142 1727204102.07207: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.07249: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 16142 1727204102.07252: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.07434: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.07625: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 16142 1727204102.07660: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc'<<< 16142 1727204102.07666: stdout chunk (state=3): >>> import '_ast' # <<< 16142 1727204102.07738: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1b1370> <<< 16142 1727204102.07741: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.07802: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.07879: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py <<< 16142 1727204102.07883: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py <<< 16142 1727204102.07885: stdout chunk (state=3): >>>import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py <<< 16142 1727204102.07901: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 16142 1727204102.07904: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.07938: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.07979: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py <<< 16142 1727204102.07983: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.08019: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.08055: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.08151: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.08214: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 16142 1727204102.08232: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 16142 1727204102.08309: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a122550> <<< 16142 1727204102.08397: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099b0feb0> <<< 16142 1727204102.08435: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py <<< 16142 1727204102.08438: stdout chunk (state=3): >>>import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 16142 1727204102.08494: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.08549: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.08575: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.08621: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 16142 1727204102.08624: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 16142 1727204102.08641: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 16142 1727204102.08683: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 16142 1727204102.08699: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 16142 1727204102.08723: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 16142 1727204102.08801: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1297f0> <<< 16142 1727204102.08851: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a127790> <<< 16142 1727204102.08910: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a122b50> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 16142 1727204102.08943: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.08962: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import 
ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 16142 1727204102.09046: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py <<< 16142 1727204102.09049: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.09051: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.09074: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py <<< 16142 1727204102.09077: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.09124: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.09190: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.09193: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.09214: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.09251: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.09290: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.09323: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.09352: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 16142 1727204102.09365: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.09429: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.09493: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.09509: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.09550: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 16142 1727204102.09699: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.09836: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.09875: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.09922: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 16142 1727204102.09946: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 16142 1727204102.09950: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 16142 1727204102.09976: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 16142 1727204102.10012: stdout chunk (state=3): >>>import 'multiprocessing.process' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7099c59370> <<< 16142 1727204102.10042: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py <<< 16142 1727204102.10045: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 16142 1727204102.10057: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 16142 1727204102.10089: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 16142 1727204102.10108: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 16142 1727204102.10127: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 16142 1727204102.10130: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099c76580> <<< 16142 1727204102.10178: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7099c764f0> <<< 16142 1727204102.10238: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099c49280> <<< 16142 1727204102.10252: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099c59970> <<< 16142 1727204102.10276: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099a147f0> <<< 16142 1727204102.10294: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099a14b20> <<< 16142 1727204102.10309: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 16142 1727204102.10322: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 16142 1727204102.10350: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py <<< 16142 1727204102.10353: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 16142 1727204102.10392: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7099cbc0a0> <<< 16142 1727204102.10397: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099c57f70> <<< 16142 1727204102.10428: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 16142 1727204102.10431: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 16142 1727204102.10457: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099cbc190> <<< 16142 1727204102.10482: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 16142 1727204102.10497: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 16142 1727204102.10531: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204102.10534: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7099a7cfd0> <<< 16142 1727204102.10556: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099ca5820> <<< 16142 1727204102.10586: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099a14d60> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 16142 1727204102.10609: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py <<< 16142 1727204102.10614: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.10635: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 16142 1727204102.10647: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.10696: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.10753: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 16142 1727204102.10757: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.10797: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.10840: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 16142 1727204102.10856: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 16142 1727204102.10882: stdout chunk (state=3): >>>import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 16142 1727204102.10885: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.10911: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 
1727204102.10940: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 16142 1727204102.10952: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.10990: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.11038: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 16142 1727204102.11041: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.11086: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.11124: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 16142 1727204102.11127: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.11182: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.11230: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.11279: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.11326: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 16142 1727204102.11340: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.11722: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.12097: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py <<< 16142 1727204102.12101: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.12137: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.12189: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.12214: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.12245: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 16142 1727204102.12260: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.12284: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.12323: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 16142 1727204102.12326: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 16142 1727204102.12372: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.12426: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 16142 1727204102.12429: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.12450: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.12487: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 16142 1727204102.12490: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.12516: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.12553: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 16142 1727204102.12557: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.12614: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.12688: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 16142 1727204102.12713: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099964e80> <<< 16142 1727204102.12732: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 16142 1727204102.12759: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 16142 1727204102.12917: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f70999649d0> <<< 16142 1727204102.12921: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 16142 1727204102.12923: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.12979: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.13039: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 16142 1727204102.13043: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.13119: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.13196: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available <<< 16142 1727204102.13258: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.13333: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available <<< 16142 1727204102.13368: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.13412: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 16142 1727204102.13441: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 16142 1727204102.13588: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204102.13592: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f70999da490> <<< 16142 1727204102.13827: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099972850> <<< 16142 1727204102.13831: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available <<< 16142 1727204102.13882: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.13934: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 16142 1727204102.13937: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.14003: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.14084: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.14169: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.14313: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 16142 1727204102.14329: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.14356: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.14401: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 16142 1727204102.14404: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.14443: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.14496: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py <<< 16142 1727204102.14499: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 16142 1727204102.14545: stdout chunk (state=3): >>># extension module 'termios' loaded from 
'/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204102.14549: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f70999d7670> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f70999d7220> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py <<< 16142 1727204102.14569: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.14583: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 16142 1727204102.14601: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.14634: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.14676: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available <<< 16142 1727204102.14809: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.14944: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 16142 1727204102.14947: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.15027: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.15112: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.15147: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.15185: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 16142 1727204102.15199: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.15277: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.15293: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.15407: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.15540: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 16142 1727204102.15543: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.15647: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 
1727204102.15760: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 16142 1727204102.15764: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.15786: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.15820: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.16260: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.16678: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 16142 1727204102.16699: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.16779: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.16873: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available <<< 16142 1727204102.16959: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.17051: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 16142 1727204102.17054: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.17182: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.17336: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available <<< 16142 1727204102.17341: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.17343: stdout chunk (state=3): >>>import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 16142 1727204102.17358: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.17394: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.17441: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 16142 1727204102.17444: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.17528: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.17608: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.17785: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.17956: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 16142 1727204102.17972: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.17999: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.18040: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 16142 1727204102.18043: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.18069: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.18092: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py <<< 16142 1727204102.18095: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.18159: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.18219: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 16142 1727204102.18231: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.18249: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.18286: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py <<< 16142 1727204102.18289: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.18334: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.18393: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 16142 1727204102.18397: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.18441: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.18496: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 16142 1727204102.18498: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.18717: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.18943: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 16142 1727204102.18947: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.18990: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.19047: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 16142 1727204102.19050: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.19086: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.19117: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 16142 1727204102.19128: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.19152: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.19194: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 16142 1727204102.19198: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.19233: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.19269: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 16142 1727204102.19278: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.19339: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.19416: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 16142 1727204102.19420: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.19434: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 16142 1727204102.19454: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.19492: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.19537: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 16142 1727204102.19541: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.19570: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.19583: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.19628: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.19668: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.19733: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.19798: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py <<< 16142 1727204102.19802: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.freebsd # loaded from 
Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 16142 1727204102.19816: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.19857: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.19906: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 16142 1727204102.19909: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.20075: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.20244: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 16142 1727204102.20247: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.20285: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.20334: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 16142 1727204102.20337: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.20387: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.20422: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 16142 1727204102.20435: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.20501: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.20586: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 16142 1727204102.20590: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.20657: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.20743: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py <<< 16142 1727204102.20746: stdout chunk (state=3): >>>import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 16142 
1727204102.20826: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204102.21805: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 16142 1727204102.21838: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 16142 1727204102.21841: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 16142 1727204102.21890: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709991b550> <<< 16142 1727204102.21893: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a137c10> <<< 16142 1727204102.21951: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099970670> <<< 16142 1727204102.22334: stdout chunk (state=3): >>>import 'gc' # <<< 16142 1727204102.24174: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 16142 1727204102.24212: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' <<< 16142 1727204102.24216: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099970d00> <<< 16142 1727204102.24219: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py <<< 16142 1727204102.24238: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' <<< 16142 1727204102.24256: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099972250> <<< 16142 1727204102.24326: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py <<< 16142 1727204102.24331: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 16142 1727204102.24358: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py <<< 16142 1727204102.24363: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099779cd0> <<< 16142 1727204102.24386: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f70997afeb0> <<< 16142 1727204102.24654: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame 
PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 16142 1727204102.48896: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAPleAC0mV69PNpLSbmzZvoLD9LsCBzX6IHRLXV1uktk0r66T6Y57EoVgflJTdo6yU0zTaJjonNzFmvC69tiRsCyywGjnvnBOvIH2vrgNGCUdVYPZbbtmQlJvol7NFFfyXQR4RSPqBKT67rYbCzbETM4j+bdDgTeDk6l7wXwz9RVvAAAAFQCuAyyjbOBDKyIW26LGcI9/nmWpHwAAAIEApIE1W6KQ7qs5kJXBdSaPoWaZUxuQhXkPWORFe7/MBn5SojDfxvJjFPo6t4QsovaCnm532Zghh1ZdB0pNm0vYcRbz3wMdfMucw/KHWt6ZEtI+sLwuMyhAVEXzmE34iXkyePtELiYzY6NyxuJ04IujI9UwD7ZnqFBHVFz529oXikIAAACBAPdUu+4Qo82CMcmrGD9vNUgtsts6GCjqBDuov8GJEALZ9ZNLlyVoNtBHLMQH9e0czLygyNGw/IDosRQkKdX4Vh4A7KXujTIOyytaN4JVJCuOBY/PeX4lreAO/UTTUJ27yT/J0Oy2Hbt+d8fZnTkZReRNPFCzvdb1nuPMG5nAyQtL", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzkKXWiNuOrU77QQcZuT2T9XVh655Sh8Sv9vLWLa1uj7ceaNsB0TBiqvDFvYPENhdKceYaGAFU7sjqbmp5dlivYwPBiBWvcOgqnpBqrMG5SvP1RMiORpW6GupBLnUaMVjopPLIi0/CDlSl2eODcEnQI6BpxCCSedEKU9UrRrCFJy+6KPQXepPwKwPTd1TMzO8wpo57B5MYrjnquTNxMfgBkYsHB/V77d0tKq8qGBTkAPD8wEWLIcZOI+SyYEfCraQ95dOGAPRTFijnd7S15CugSlJ/vvcHSFXOlbgFzeNnU2jZneagkBfaOJch72opD3ebISSHCx1/kJvHN7MbksI+ljJa3Nw5LwP1XjUpT7dQMOZJDdVStXKp86K4XpWud+wMbQVVyU5QoFsCl7YTWWmSDRiPJOQI2myfizCT8i42rJ0WXm5OnqpHn1Jw4nGlcVnfgPQA/zxMldzReXdHnvriqKC9+97XgY6pj42YYP78PhOu1D2xH1AXmloNM+63VvU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPT1h7wNcUomxtav688iXvnCnFqrHnEKf4gRaBY3w4BwbWOGxE8hq5snF9Tp+0agFeN/u980/y8BJWdWIO9Lz8I=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPe8liWy3mh5GzCz9W616J2ArXnLOjLOZSwfmBX3Q1SI<<< 16142 1727204102.48921: stdout chunk (state=3): >>>", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "e28ab0e542474a869c23f7ace4640799", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) 
Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2798, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 734, "free": 2798}, "nocache": {"free": 3271, "used": 261}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec243623-fa66-7445-44ba-1070930583a9", "ansible_product_uuid": "ec243623-fa66-7445-44ba-1070930583a9", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 465, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264274272256, "block_size": 4096, "block_total": 65519355, "block_available": 64520086, "block_used": 999269, "inode_total": 131071472, "inode_available": 130998230, "inode_used": 73242, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "02", "epoch": "1727204102", "epoch_int": "1727204102", "date": "2024-09-24", "time": "14:55:02", "iso8601_micro": "2024-09-24T18:55:02.446022Z", "iso8601": "2024-09-24T18:55:02Z", "iso8601_basic": "20240924T145502446022", "iso8601_basic_short": "20240924T145<<< 16142 1727204102.48950: stdout chunk (state=3): >>>502", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 48676 10.31.13.78 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", 
"XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 48676 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_loadavg": {"1m": 0.47, "5m": 0.29, "15m": 0.14}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:ff:f5:f2:b9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.78", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ffff:fef5:f2b9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]<<< 16142 1727204102.48972: stdout chunk (state=3): >>>", "rx_all": "off [fixed]", 
"tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.78", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:ff:f5:f2:b9", "mtu": 9001, "type": "ether", 
"alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.78"], "ansible_all_ipv6_addresses": ["fe80::8ff:ffff:fef5:f2b9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.78", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ffff:fef5:f2b9"]}, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 16142 1727204102.49497: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value <<< 16142 1727204102.49503: stdout chunk (state=3): >>># clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr <<< 16142 1727204102.49507: stdout chunk (state=3): >>># cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs <<< 16142 1727204102.49510: stdout chunk (state=3): >>># cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat <<< 16142 1727204102.49545: stdout chunk (state=3): >>># cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib<<< 16142 1727204102.49551: stdout chunk (state=3): >>> # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing 
importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc<<< 16142 1727204102.49569: stdout chunk (state=3): >>> # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression <<< 16142 1727204102.49604: stdout chunk (state=3): >>># cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 <<< 16142 1727204102.49610: stdout chunk (state=3): >>># cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 <<< 16142 1727204102.49618: stdout chunk (state=3): >>># cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ <<< 16142 1727204102.49676: stdout chunk (state=3): >>># destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 <<< 16142 1727204102.49681: stdout chunk (state=3): >>># cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat <<< 16142 1727204102.49684: stdout chunk (state=3): >>># destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters<<< 16142 1727204102.49688: stdout chunk (state=3): >>> # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux <<< 16142 1727204102.49692: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy <<< 16142 1727204102.49719: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors<<< 16142 1727204102.49722: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info<<< 16142 1727204102.49824: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool <<< 16142 1727204102.49827: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing 
ansible.module_utils.facts.other.ohai <<< 16142 1727204102.49832: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils <<< 16142 1727204102.49834: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl <<< 16142 1727204102.49838: stdout chunk (state=3): >>># cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass <<< 16142 1727204102.49840: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd <<< 16142 1727204102.49848: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin<<< 16142 1727204102.49852: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # 
cleanup[2] removing ansible.module_utils.facts.network.linux <<< 16142 1727204102.49854: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base <<< 16142 1727204102.49907: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin <<< 16142 1727204102.49913: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy 
ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly <<< 16142 1727204102.49915: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual <<< 16142 1727204102.49917: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd<<< 16142 1727204102.49921: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep <<< 16142 1727204102.49940: stdout chunk (state=3): >>># cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 16142 1727204102.50188: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 16142 1727204102.50211: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 16142 1727204102.50256: stdout chunk (state=3): >>># destroy zipimport <<< 16142 1727204102.50261: stdout chunk (state=3): >>># destroy _compression <<< 16142 1727204102.50265: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 <<< 16142 1727204102.50305: stdout chunk (state=3): >>># destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 16142 1727204102.50308: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json <<< 16142 1727204102.50311: stdout chunk (state=3): >>># destroy encodings <<< 16142 1727204102.50328: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 16142 1727204102.50376: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 16142 1727204102.50425: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 16142 1727204102.50428: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.queues <<< 16142 1727204102.50432: stdout chunk (state=3): >>># destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle <<< 16142 1727204102.50454: stdout chunk (state=3): >>># destroy queue <<< 16142 1727204102.50457: stdout chunk (state=3): >>># destroy multiprocessing.reduction <<< 16142 1727204102.50488: stdout chunk (state=3): >>># destroy shlex <<< 
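[annotation] The facts payload at the start of this chunk closes with the module invocation ("module_args": gather_subset=["all"], gather_timeout=10, filter=[], fact_path=/etc/ansible/facts.d). A minimal sketch of driving the same setup module by hand with explicit module_args, assuming a local ansible install and a host resolvable from the default inventory; the host name and the subset/filter values below are illustrative, not taken from this run:

#!/usr/bin/env python3
"""Sketch: invoke the setup module directly with explicit module_args.

Assumes `ansible` is on PATH; "managed-node1" and the argument values are
placeholders for illustration only.
"""
import subprocess

cmd = [
    "ansible", "managed-node1",
    "-m", "ansible.builtin.setup",
    # The same knobs that appear in the invocation above: restrict the
    # gathered subset, filter the returned fact names, cap the timeout.
    "-a", "gather_subset=network filter=ansible_all_ipv4_addresses gather_timeout=10",
]

result = subprocess.run(cmd, capture_output=True, text=True)
print(result.returncode)
print(result.stdout)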
16142 1727204102.50498: stdout chunk (state=3): >>># destroy datetime <<< 16142 1727204102.50501: stdout chunk (state=3): >>># destroy base64 <<< 16142 1727204102.50524: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json <<< 16142 1727204102.50583: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 16142 1727204102.50586: stdout chunk (state=3): >>># destroy glob <<< 16142 1727204102.50588: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 16142 1727204102.50590: stdout chunk (state=3): >>># destroy multiprocessing.connection <<< 16142 1727204102.50592: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util <<< 16142 1727204102.50594: stdout chunk (state=3): >>># destroy array # destroy multiprocessing.dummy.connection <<< 16142 1727204102.50611: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios <<< 16142 1727204102.50637: stdout chunk (state=3): >>># cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux <<< 16142 1727204102.50652: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform <<< 16142 1727204102.50674: stdout chunk (state=3): >>># destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl <<< 16142 1727204102.50692: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading <<< 16142 1727204102.50708: stdout chunk (state=3): >>># cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 16142 1727204102.50725: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools <<< 16142 1727204102.50751: stdout chunk (state=3): >>># cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping 
itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse <<< 16142 1727204102.50769: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath <<< 16142 1727204102.50783: stdout chunk (state=3): >>># cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 <<< 16142 1727204102.50803: stdout chunk (state=3): >>># cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 16142 1727204102.50817: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 16142 1727204102.50844: stdout chunk (state=3): >>># destroy gc # destroy unicodedata # destroy termios <<< 16142 1727204102.50865: stdout chunk (state=3): >>># destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma <<< 16142 1727204102.50878: stdout chunk (state=3): >>># destroy zlib # destroy _signal <<< 16142 1727204102.51021: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 16142 1727204102.51036: stdout chunk (state=3): >>># destroy _sre # destroy sre_parse # destroy tokenize <<< 16142 1727204102.51054: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath <<< 16142 1727204102.51080: stdout chunk (state=3): >>># destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 16142 1727204102.51097: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator <<< 16142 1727204102.51110: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves # destroy _operator <<< 16142 1727204102.51122: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 16142 1727204102.51157: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 16142 1727204102.51445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
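[annotation] The "# cleanup[2] removing ...", "# destroy ..." and "# cleanup[3] wiping ..." lines above are the remote interpreter's verbose module-teardown report emitted at shutdown, and the "import X # ..." lines in the replayed stdout below are the matching import-time messages. A minimal local sketch (assuming only a python3 interpreter) that produces the same style of trace for a trivial import:

#!/usr/bin/env python3
"""Sketch: reproduce the verbose import/teardown trace seen in this log.

CPython's -v flag prints "import X # ..." lines as modules load and
"# cleanup"/"# destroy" lines as the interpreter shuts down; both streams
go to stderr.
"""
import subprocess
import sys

proc = subprocess.run(
    [sys.executable, "-v", "-c", "import json"],
    capture_output=True,
    text=True,
)

# Keep only the lines matching the patterns visible in the log above.
for line in proc.stderr.splitlines():
    if line.startswith("import ") or line.startswith(("# cleanup", "# destroy")):
        print(line)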
<<< 16142 1727204102.51539: stderr chunk (state=3): >>><<< 16142 1727204102.51542: stdout chunk (state=3): >>><<< 16142 1727204102.51885: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709adb3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ad583a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709adb3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709adb3ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ad58490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ad58940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ad58670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ad0f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ad0f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from 
'/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ad32850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ad0f940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ad70880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ad08d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ad32d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ad58970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709acadf10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709acb40a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709aca75b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709acae6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709acad3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # 
code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709ac31e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac31910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac31f10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac31fd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac420d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac89d90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac82670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac956d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709acb5e80> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709ac42cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac892b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709ac952e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709acbba30> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac42eb0> 
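[annotation] Further down this replayed stdout the module payload is imported straight from a zip archive ("# zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_.../...zip'", "import ansible # loaded from Zip ..."). The sketch below shows that underlying zipimport mechanism in isolation, assuming nothing beyond the standard library; the demo_mod name is made up for illustration and is not part of the real payload:

#!/usr/bin/env python3
"""Sketch: import a module from a zip archive, as the packed payload is.

Builds a throwaway zip containing demo_mod.py (a hypothetical module),
puts the archive on sys.path, and lets the zipimport machinery load it.
"""
import importlib
import sys
import tempfile
import zipfile
from pathlib import Path

workdir = Path(tempfile.mkdtemp())
payload = workdir / "payload.zip"

with zipfile.ZipFile(payload, "w") as zf:
    # A single-file module inside the archive, analogous to the packed
    # ansible/module_utils tree in the real payload zip.
    zf.writestr("demo_mod.py", "GREETING = 'loaded from zip'\n")

sys.path.insert(0, str(payload))   # the zipimport hook handles zip entries on sys.path
demo_mod = importlib.import_module("demo_mod")
print(demo_mod.__loader__)         # a zipimport.zipimporter instance
print(demo_mod.GREETING)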
import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac42df0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac42d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a9723d0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a9724c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac4af40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac44a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac44490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a8a6220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a95d520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac44f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709acbb0a0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a8b8b50> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a8b8e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches 
/usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a8c9790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a8c9cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a857400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a8b8f70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a8682e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a8c9610> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a8683a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac42a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a884700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a8849d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a8847c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a8848b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a884d00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a88f250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a884940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a877a90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709ac42610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a884af0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f709a7ad6d0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1ae820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a1ae160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1ae280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f709a1aef70> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1ae4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1aed90> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a1aefd0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1ae100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1830d0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a088340> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a088040> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a088ca0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a196dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1963a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a196fd0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1e3d30> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1b5d30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1b5400> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a161b20> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a1b5520> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1b5550> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a0f6fd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1f5250> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a0f4850> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1f53d0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1f5ca0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a0f47f0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a18ec10> # extension 
module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a1f5fa0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a1f5550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1ee910> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a0e8940> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a106d90> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a0f3580> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a0e8ee0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a0f39a0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 
'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a12f7f0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1348b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099c93970> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a16c730> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1b1370> # zipimport: zlib available # zipimport: 
zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709a122550> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099b0feb0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a1297f0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a127790> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a122b50> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099c59370> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099c76580> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7099c764f0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099c49280> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099c59970> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099a147f0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099a14b20> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7099cbc0a0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099c57f70> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099cbc190> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7099a7cfd0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099ca5820> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099a14d60> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099964e80> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 
'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f70999649d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f70999da490> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099972850> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f70999d7670> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f70999d7220> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib 
available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: 
zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_z3z7mggg/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f709991b550> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f709a137c10> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099970670> import 'gc' # # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099970d00> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099972250> # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7099779cd0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f70997afeb0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": 
true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAPleAC0mV69PNpLSbmzZvoLD9LsCBzX6IHRLXV1uktk0r66T6Y57EoVgflJTdo6yU0zTaJjonNzFmvC69tiRsCyywGjnvnBOvIH2vrgNGCUdVYPZbbtmQlJvol7NFFfyXQR4RSPqBKT67rYbCzbETM4j+bdDgTeDk6l7wXwz9RVvAAAAFQCuAyyjbOBDKyIW26LGcI9/nmWpHwAAAIEApIE1W6KQ7qs5kJXBdSaPoWaZUxuQhXkPWORFe7/MBn5SojDfxvJjFPo6t4QsovaCnm532Zghh1ZdB0pNm0vYcRbz3wMdfMucw/KHWt6ZEtI+sLwuMyhAVEXzmE34iXkyePtELiYzY6NyxuJ04IujI9UwD7ZnqFBHVFz529oXikIAAACBAPdUu+4Qo82CMcmrGD9vNUgtsts6GCjqBDuov8GJEALZ9ZNLlyVoNtBHLMQH9e0czLygyNGw/IDosRQkKdX4Vh4A7KXujTIOyytaN4JVJCuOBY/PeX4lreAO/UTTUJ27yT/J0Oy2Hbt+d8fZnTkZReRNPFCzvdb1nuPMG5nAyQtL", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzkKXWiNuOrU77QQcZuT2T9XVh655Sh8Sv9vLWLa1uj7ceaNsB0TBiqvDFvYPENhdKceYaGAFU7sjqbmp5dlivYwPBiBWvcOgqnpBqrMG5SvP1RMiORpW6GupBLnUaMVjopPLIi0/CDlSl2eODcEnQI6BpxCCSedEKU9UrRrCFJy+6KPQXepPwKwPTd1TMzO8wpo57B5MYrjnquTNxMfgBkYsHB/V77d0tKq8qGBTkAPD8wEWLIcZOI+SyYEfCraQ95dOGAPRTFijnd7S15CugSlJ/vvcHSFXOlbgFzeNnU2jZneagkBfaOJch72opD3ebISSHCx1/kJvHN7MbksI+ljJa3Nw5LwP1XjUpT7dQMOZJDdVStXKp86K4XpWud+wMbQVVyU5QoFsCl7YTWWmSDRiPJOQI2myfizCT8i42rJ0WXm5OnqpHn1Jw4nGlcVnfgPQA/zxMldzReXdHnvriqKC9+97XgY6pj42YYP78PhOu1D2xH1AXmloNM+63VvU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPT1h7wNcUomxtav688iXvnCnFqrHnEKf4gRaBY3w4BwbWOGxE8hq5snF9Tp+0agFeN/u980/y8BJWdWIO9Lz8I=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPe8liWy3mh5GzCz9W616J2ArXnLOjLOZSwfmBX3Q1SI", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "e28ab0e542474a869c23f7ace4640799", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2798, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 734, "free": 2798}, "nocache": {"free": 3271, "used": 261}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", 
"ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec243623-fa66-7445-44ba-1070930583a9", "ansible_product_uuid": "ec243623-fa66-7445-44ba-1070930583a9", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 465, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264274272256, "block_size": 4096, "block_total": 65519355, "block_available": 64520086, "block_used": 999269, "inode_total": 131071472, "inode_available": 130998230, "inode_used": 73242, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "02", "epoch": "1727204102", "epoch_int": "1727204102", "date": "2024-09-24", "time": "14:55:02", "iso8601_micro": "2024-09-24T18:55:02.446022Z", "iso8601": "2024-09-24T18:55:02Z", "iso8601_basic": "20240924T145502446022", "iso8601_basic_short": "20240924T145502", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 48676 10.31.13.78 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 48676 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_loadavg": {"1m": 0.47, "5m": 0.29, "15m": 0.14}, "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:ff:f5:f2:b9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.78", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ffff:fef5:f2b9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off 
[fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.78", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:ff:f5:f2:b9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.78"], "ansible_all_ipv6_addresses": ["fe80::8ff:ffff:fef5:f2b9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.78", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ffff:fef5:f2b9"]}, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", 
"gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing 
hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing 
ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing 
getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy 
ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy 
ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # 
cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
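The two [WARNING] messages that follow (module output containing junk after the JSON payload, and the discovered Python interpreter /usr/bin/python3.9 on managed-node2) are informational here. The usual way to quiet the interpreter-discovery warning, as the interpreter_discovery documentation it links to describes, is to pin ansible_python_interpreter for the managed hosts. Below is a minimal, hypothetical inventory sketch: the host name and address are taken from this log, everything else is assumed, and the inventory actually used by this run is not reproduced here.

    # Hypothetical inventory sketch, not the file used by this run.
    # Pinning the interpreter suppresses the "discovered Python interpreter" warning below;
    # the address matches the 10.31.13.78 seen in the ssh debug output above.
    all:
      hosts:
        managed-node2:
          ansible_host: 10.31.13.78
          ansible_python_interpreter: /usr/bin/python3.9

Note also that the connection reuses an existing ssh ControlMaster session (the "auto-mux: Trying existing master" lines), while ansible_pipelining is left at False later in this log, so each module run still performs the mkdir / sftp put / chmod / rm round trips visible further down; enabling pipelining would skip those steps on hosts where become does not require a tty.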
[WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy 
zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # 
cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing 
ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy 
ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy 
multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # 
cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks [WARNING]: Platform linux on host managed-node2 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 16142 1727204102.54213: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204100.9412155-16348-101211323707141/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204102.54217: _low_level_execute_command(): starting 16142 1727204102.54219: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204100.9412155-16348-101211323707141/ > /dev/null 2>&1 && sleep 0' 16142 1727204102.55545: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204102.55549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204102.55701: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204102.55704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 16142 1727204102.55707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204102.55770: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204102.55875: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204102.55892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204102.55957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204102.57799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204102.57876: stderr chunk (state=3): >>><<< 16142 1727204102.57880: stdout chunk (state=3): >>><<< 16142 1727204102.58074: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204102.58077: handler run complete 16142 1727204102.58080: variable 'ansible_facts' from source: unknown 16142 1727204102.58149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204102.58493: variable 'ansible_facts' from source: unknown 16142 1727204102.58590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204102.59119: attempt loop complete, returning result 16142 1727204102.59167: _execute() done 16142 1727204102.59274: dumping result to json 16142 1727204102.59307: done dumping result, returning 16142 1727204102.59384: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [0affcd87-79f5-fddd-f6c7-0000000001bc] 16142 1727204102.59394: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001bc ok: [managed-node2] 16142 1727204102.60482: no more pending results, returning what we have 16142 1727204102.60486: results queue empty 16142 1727204102.60487: checking for any_errors_fatal 16142 1727204102.60488: done checking for any_errors_fatal 16142 1727204102.60489: checking for max_fail_percentage 16142 1727204102.60490: done checking for max_fail_percentage 16142 1727204102.60491: checking to see if all hosts have failed and the running result is not ok 16142 1727204102.60492: done checking to see if all hosts 
have failed 16142 1727204102.60493: getting the remaining hosts for this loop 16142 1727204102.60495: done getting the remaining hosts for this loop 16142 1727204102.60499: getting the next task for host managed-node2 16142 1727204102.60507: done getting next task for host managed-node2 16142 1727204102.60509: ^ task is: TASK: meta (flush_handlers) 16142 1727204102.60511: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204102.60515: getting variables 16142 1727204102.60517: in VariableManager get_vars() 16142 1727204102.60544: Calling all_inventory to load vars for managed-node2 16142 1727204102.60547: Calling groups_inventory to load vars for managed-node2 16142 1727204102.60550: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204102.60561: Calling all_plugins_play to load vars for managed-node2 16142 1727204102.60567: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204102.60570: Calling groups_plugins_play to load vars for managed-node2 16142 1727204102.60762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204102.61278: done with get_vars() 16142 1727204102.61290: done getting variables 16142 1727204102.61667: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001bc 16142 1727204102.61673: WORKER PROCESS EXITING 16142 1727204102.61720: in VariableManager get_vars() 16142 1727204102.61732: Calling all_inventory to load vars for managed-node2 16142 1727204102.61734: Calling groups_inventory to load vars for managed-node2 16142 1727204102.61737: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204102.61742: Calling all_plugins_play to load vars for managed-node2 16142 1727204102.61745: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204102.61748: Calling groups_plugins_play to load vars for managed-node2 16142 1727204102.62443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204102.63087: done with get_vars() 16142 1727204102.63104: done queuing things up, now waiting for results queue to drain 16142 1727204102.63107: results queue empty 16142 1727204102.63108: checking for any_errors_fatal 16142 1727204102.63110: done checking for any_errors_fatal 16142 1727204102.63111: checking for max_fail_percentage 16142 1727204102.63112: done checking for max_fail_percentage 16142 1727204102.63118: checking to see if all hosts have failed and the running result is not ok 16142 1727204102.63119: done checking to see if all hosts have failed 16142 1727204102.63120: getting the remaining hosts for this loop 16142 1727204102.63121: done getting the remaining hosts for this loop 16142 1727204102.63124: getting the next task for host managed-node2 16142 1727204102.63129: done getting next task for host managed-node2 16142 1727204102.63132: ^ task is: TASK: Include the task 'el_repo_setup.yml' 16142 1727204102.63133: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204102.63136: getting variables 16142 1727204102.63137: in VariableManager get_vars() 16142 1727204102.63147: Calling all_inventory to load vars for managed-node2 16142 1727204102.63149: Calling groups_inventory to load vars for managed-node2 16142 1727204102.63151: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204102.63157: Calling all_plugins_play to load vars for managed-node2 16142 1727204102.63159: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204102.63161: Calling groups_plugins_play to load vars for managed-node2 16142 1727204102.63745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204102.64384: done with get_vars() 16142 1727204102.64397: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:11 Tuesday 24 September 2024 14:55:02 -0400 (0:00:01.786) 0:00:01.821 ***** 16142 1727204102.64711: entering _queue_task() for managed-node2/include_tasks 16142 1727204102.64714: Creating lock for include_tasks 16142 1727204102.65558: worker is 1 (out of 1 available) 16142 1727204102.65693: exiting _queue_task() for managed-node2/include_tasks 16142 1727204102.65708: done queuing things up, now waiting for results queue to drain 16142 1727204102.65711: waiting for pending results... 16142 1727204102.66488: running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' 16142 1727204102.66695: in run() - task 0affcd87-79f5-fddd-f6c7-000000000006 16142 1727204102.66718: variable 'ansible_search_path' from source: unknown 16142 1727204102.66762: calling self._execute() 16142 1727204102.66842: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204102.66853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204102.66866: variable 'omit' from source: magic vars 16142 1727204102.66980: _execute() done 16142 1727204102.66999: dumping result to json 16142 1727204102.67007: done dumping result, returning 16142 1727204102.67018: done running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' [0affcd87-79f5-fddd-f6c7-000000000006] 16142 1727204102.67029: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000006 16142 1727204102.67174: no more pending results, returning what we have 16142 1727204102.67180: in VariableManager get_vars() 16142 1727204102.67216: Calling all_inventory to load vars for managed-node2 16142 1727204102.67219: Calling groups_inventory to load vars for managed-node2 16142 1727204102.67223: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204102.67239: Calling all_plugins_play to load vars for managed-node2 16142 1727204102.67243: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204102.67247: Calling groups_plugins_play to load vars for managed-node2 16142 1727204102.67436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204102.67675: done with get_vars() 16142 1727204102.67687: variable 'ansible_search_path' from source: unknown 16142 1727204102.67706: we have included files to process 16142 1727204102.67707: generating all_blocks data 16142 1727204102.67709: done generating all_blocks data 16142 1727204102.67710: processing included file: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 16142 1727204102.67711: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 16142 1727204102.67714: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 16142 1727204102.68383: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000006 16142 1727204102.68387: WORKER PROCESS EXITING 16142 1727204102.69190: in VariableManager get_vars() 16142 1727204102.69323: done with get_vars() 16142 1727204102.69339: done processing included file 16142 1727204102.69341: iterating over new_blocks loaded from include file 16142 1727204102.69343: in VariableManager get_vars() 16142 1727204102.69354: done with get_vars() 16142 1727204102.69356: filtering new block on tags 16142 1727204102.69375: done filtering new block on tags 16142 1727204102.69379: in VariableManager get_vars() 16142 1727204102.69390: done with get_vars() 16142 1727204102.69392: filtering new block on tags 16142 1727204102.69407: done filtering new block on tags 16142 1727204102.69410: in VariableManager get_vars() 16142 1727204102.69420: done with get_vars() 16142 1727204102.69422: filtering new block on tags 16142 1727204102.69549: done filtering new block on tags 16142 1727204102.69552: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node2 16142 1727204102.69559: extending task lists for all hosts with included blocks 16142 1727204102.69611: done extending task lists 16142 1727204102.69612: done processing included files 16142 1727204102.69613: results queue empty 16142 1727204102.69614: checking for any_errors_fatal 16142 1727204102.69615: done checking for any_errors_fatal 16142 1727204102.69616: checking for max_fail_percentage 16142 1727204102.69617: done checking for max_fail_percentage 16142 1727204102.69618: checking to see if all hosts have failed and the running result is not ok 16142 1727204102.69618: done checking to see if all hosts have failed 16142 1727204102.69619: getting the remaining hosts for this loop 16142 1727204102.69620: done getting the remaining hosts for this loop 16142 1727204102.69623: getting the next task for host managed-node2 16142 1727204102.69627: done getting next task for host managed-node2 16142 1727204102.69629: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 16142 1727204102.69631: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204102.69633: getting variables 16142 1727204102.69634: in VariableManager get_vars() 16142 1727204102.69756: Calling all_inventory to load vars for managed-node2 16142 1727204102.69759: Calling groups_inventory to load vars for managed-node2 16142 1727204102.69761: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204102.69769: Calling all_plugins_play to load vars for managed-node2 16142 1727204102.69771: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204102.69774: Calling groups_plugins_play to load vars for managed-node2 16142 1727204102.70053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204102.70462: done with get_vars() 16142 1727204102.70475: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:55:02 -0400 (0:00:00.059) 0:00:01.883 ***** 16142 1727204102.70670: entering _queue_task() for managed-node2/setup 16142 1727204102.71437: worker is 1 (out of 1 available) 16142 1727204102.71451: exiting _queue_task() for managed-node2/setup 16142 1727204102.71466: done queuing things up, now waiting for results queue to drain 16142 1727204102.71468: waiting for pending results... 16142 1727204102.72621: running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 16142 1727204102.72903: in run() - task 0affcd87-79f5-fddd-f6c7-0000000001cd 16142 1727204102.73144: variable 'ansible_search_path' from source: unknown 16142 1727204102.73151: variable 'ansible_search_path' from source: unknown 16142 1727204102.73191: calling self._execute() 16142 1727204102.73321: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204102.73619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204102.73636: variable 'omit' from source: magic vars 16142 1727204102.74733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204102.81053: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204102.81243: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204102.81356: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204102.81570: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204102.81602: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204102.81720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204102.81893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204102.81997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 16142 1727204102.82060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204102.82098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204102.82316: variable 'ansible_facts' from source: unknown 16142 1727204102.82389: variable 'network_test_required_facts' from source: task vars 16142 1727204102.82436: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 16142 1727204102.82449: variable 'omit' from source: magic vars 16142 1727204102.82496: variable 'omit' from source: magic vars 16142 1727204102.82539: variable 'omit' from source: magic vars 16142 1727204102.82567: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204102.82599: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204102.82627: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204102.82649: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204102.82667: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204102.82700: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204102.82708: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204102.82716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204102.82818: Set connection var ansible_timeout to 10 16142 1727204102.82827: Set connection var ansible_connection to ssh 16142 1727204102.82841: Set connection var ansible_shell_type to sh 16142 1727204102.82851: Set connection var ansible_shell_executable to /bin/sh 16142 1727204102.82861: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204102.82876: Set connection var ansible_pipelining to False 16142 1727204102.82903: variable 'ansible_shell_executable' from source: unknown 16142 1727204102.82911: variable 'ansible_connection' from source: unknown 16142 1727204102.82918: variable 'ansible_module_compression' from source: unknown 16142 1727204102.82925: variable 'ansible_shell_type' from source: unknown 16142 1727204102.82932: variable 'ansible_shell_executable' from source: unknown 16142 1727204102.82942: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204102.82949: variable 'ansible_pipelining' from source: unknown 16142 1727204102.82955: variable 'ansible_timeout' from source: unknown 16142 1727204102.82963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204102.83117: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204102.83133: variable 'omit' from source: magic vars 16142 1727204102.83144: starting attempt loop 16142 
1727204102.83150: running the handler 16142 1727204102.83174: _low_level_execute_command(): starting 16142 1727204102.83186: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204102.83949: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204102.83963: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204102.83983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204102.84114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204102.84250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204102.84267: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204102.84281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204102.84298: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204102.84309: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204102.84318: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204102.84332: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204102.84345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204102.84360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204102.84376: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204102.84386: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204102.84399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204102.84476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204102.84498: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204102.84513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204102.84590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204102.86435: stdout chunk (state=3): >>>/root <<< 16142 1727204102.86591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204102.86678: stderr chunk (state=3): >>><<< 16142 1727204102.86682: stdout chunk (state=3): >>><<< 16142 1727204102.86811: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204102.86815: _low_level_execute_command(): starting 16142 1727204102.86819: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204102.8670805-16432-261786856753140 `" && echo ansible-tmp-1727204102.8670805-16432-261786856753140="` echo /root/.ansible/tmp/ansible-tmp-1727204102.8670805-16432-261786856753140 `" ) && sleep 0' 16142 1727204102.89167: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204102.89172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204102.89428: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204102.89435: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204102.89438: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204102.89706: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204102.89710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204102.89718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204102.89775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204102.91614: stdout chunk (state=3): >>>ansible-tmp-1727204102.8670805-16432-261786856753140=/root/.ansible/tmp/ansible-tmp-1727204102.8670805-16432-261786856753140 <<< 16142 1727204102.91712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204102.91803: stderr chunk (state=3): >>><<< 16142 1727204102.91806: stdout chunk (state=3): >>><<< 16142 1727204102.92073: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204102.8670805-16432-261786856753140=/root/.ansible/tmp/ansible-tmp-1727204102.8670805-16432-261786856753140 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204102.92077: variable 'ansible_module_compression' from source: unknown 16142 1727204102.92079: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 16142 1727204102.92081: variable 'ansible_facts' from source: unknown 16142 1727204102.92173: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204102.8670805-16432-261786856753140/AnsiballZ_setup.py 16142 1727204102.93318: Sending initial data 16142 1727204102.93322: Sent initial data (154 bytes) 16142 1727204102.97254: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204102.97418: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204102.97434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204102.97454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204102.97504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204102.97521: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204102.97535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204102.97555: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204102.97568: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204102.97578: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204102.97587: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204102.97599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204102.97616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204102.97631: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204102.97641: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204102.97654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204102.97742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204102.97872: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204102.97892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204102.97971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204102.99774: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 
debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204102.99800: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204102.99842: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpma3aq7rz /root/.ansible/tmp/ansible-tmp-1727204102.8670805-16432-261786856753140/AnsiballZ_setup.py <<< 16142 1727204102.99880: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204103.02741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204103.02921: stderr chunk (state=3): >>><<< 16142 1727204103.02925: stdout chunk (state=3): >>><<< 16142 1727204103.02927: done transferring module to remote 16142 1727204103.02929: _low_level_execute_command(): starting 16142 1727204103.02931: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204102.8670805-16432-261786856753140/ /root/.ansible/tmp/ansible-tmp-1727204102.8670805-16432-261786856753140/AnsiballZ_setup.py && sleep 0' 16142 1727204103.04734: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204103.04749: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204103.04762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204103.04789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204103.04838: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204103.04911: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204103.04926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204103.04943: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204103.04954: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204103.04965: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204103.04978: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204103.04989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204103.05007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204103.05022: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204103.05032: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204103.05044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204103.05238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204103.05267: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204103.05286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204103.05366: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204103.07240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204103.07244: stdout chunk (state=3): >>><<< 16142 1727204103.07246: stderr chunk (state=3): >>><<< 16142 1727204103.07356: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204103.07360: _low_level_execute_command(): starting 16142 1727204103.07363: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204102.8670805-16432-261786856753140/AnsiballZ_setup.py && sleep 0' 16142 1727204103.09243: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204103.09381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204103.09398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204103.09419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204103.09468: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204103.09486: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204103.09500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204103.09519: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204103.09532: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204103.09544: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204103.09557: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204103.09574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204103.09597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204103.09612: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204103.09623: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204103.09637: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204103.09830: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204103.09848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204103.09863: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204103.10032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204103.12013: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # <<< 16142 1727204103.12017: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 16142 1727204103.12073: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 16142 1727204103.12110: stdout chunk (state=3): >>>import 'posix' # <<< 16142 1727204103.12144: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 16142 1727204103.12147: stdout chunk (state=3): >>># installing zipimport hook <<< 16142 1727204103.12185: stdout chunk (state=3): >>>import 'time' # <<< 16142 1727204103.12201: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 16142 1727204103.12248: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 16142 1727204103.12267: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 16142 1727204103.12292: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 16142 1727204103.12329: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6843dc0> <<< 16142 1727204103.12362: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 16142 1727204103.12384: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 16142 1727204103.12387: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65d83a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6843b20> <<< 16142 1727204103.12418: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 16142 1727204103.12435: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6843ac0> <<< 16142 1727204103.12451: stdout chunk (state=3): >>>import '_signal' # <<< 16142 1727204103.12481: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 16142 1727204103.12495: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65d8490> <<< 16142 1727204103.12523: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from 
'/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 16142 1727204103.12551: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 16142 1727204103.12575: stdout chunk (state=3): >>>import '_abc' # <<< 16142 1727204103.12578: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65d8940> <<< 16142 1727204103.12599: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65d8670> <<< 16142 1727204103.12633: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 16142 1727204103.12649: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 16142 1727204103.12663: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 16142 1727204103.12687: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 16142 1727204103.12700: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 16142 1727204103.12726: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 16142 1727204103.12754: stdout chunk (state=3): >>>import '_stat' # <<< 16142 1727204103.12757: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca658f190> <<< 16142 1727204103.12775: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 16142 1727204103.12793: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 16142 1727204103.12871: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca658f220> <<< 16142 1727204103.12902: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 16142 1727204103.12907: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 16142 1727204103.12939: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py <<< 16142 1727204103.12942: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65b2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca658f940> <<< 16142 1727204103.12977: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65f0880> <<< 16142 1727204103.13011: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py <<< 16142 1727204103.13014: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 16142 1727204103.13016: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6588d90> <<< 16142 1727204103.13062: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 16142 1727204103.13077: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65b2d90> <<< 16142 1727204103.13136: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65d8970> <<< 16142 1727204103.13168: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 16142 1727204103.13499: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 16142 1727204103.13514: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 16142 1727204103.13541: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 16142 1727204103.13570: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 16142 1727204103.13581: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 16142 1727204103.13612: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 16142 1727204103.13617: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 16142 1727204103.13633: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6552f10> <<< 16142 1727204103.13683: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65590a0> <<< 16142 1727204103.13703: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 16142 1727204103.13706: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 16142 1727204103.13733: stdout chunk (state=3): >>>import '_sre' # <<< 16142 1727204103.13748: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 16142 1727204103.13775: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 16142 1727204103.13798: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py <<< 16142 1727204103.13801: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 16142 1727204103.13811: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca654c5b0> <<< 16142 1727204103.13838: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65536a0> <<< 16142 1727204103.13841: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65523d0> <<< 16142 1727204103.13870: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 16142 1727204103.13940: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 16142 1727204103.13961: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 16142 1727204103.13999: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 16142 1727204103.14013: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 16142 1727204103.14056: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca643ae80> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca643a970> <<< 16142 1727204103.14074: stdout chunk (state=3): >>>import 'itertools' # <<< 16142 1727204103.14107: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py <<< 16142 1727204103.14110: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca643af70> <<< 16142 1727204103.14136: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 16142 1727204103.14139: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 16142 1727204103.14170: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca643adc0> <<< 16142 1727204103.14203: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 16142 1727204103.14208: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca644a130> <<< 16142 1727204103.14224: stdout chunk (state=3): >>>import '_collections' # <<< 16142 1727204103.14273: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca652edf0> import '_functools' # <<< 16142 1727204103.14304: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65276d0> <<< 16142 1727204103.14372: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py <<< 16142 1727204103.14375: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca653a730> <<< 16142 1727204103.14379: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca655aeb0> <<< 16142 1727204103.14396: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc 
matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 16142 1727204103.14439: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca644ad30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca652e310> <<< 16142 1727204103.14484: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204103.14488: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca653a340> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6560a60> <<< 16142 1727204103.14518: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 16142 1727204103.14521: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 16142 1727204103.14555: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 16142 1727204103.14580: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 16142 1727204103.14584: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 16142 1727204103.14602: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca644af10> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca644ae50> <<< 16142 1727204103.14641: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py <<< 16142 1727204103.14644: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca644adc0> <<< 16142 1727204103.14690: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py<<< 16142 1727204103.14693: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 16142 1727204103.14698: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 16142 1727204103.14709: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 16142 1727204103.14731: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 16142 1727204103.14782: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 16142 1727204103.14813: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' <<< 16142 1727204103.14816: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca641e430> <<< 16142 1727204103.14829: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 16142 1727204103.14844: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 16142 1727204103.14885: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca641e520> <<< 16142 1727204103.15005: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6453fa0> <<< 16142 1727204103.15050: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca644daf0> <<< 16142 1727204103.15053: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca644d4c0> <<< 16142 1727204103.15085: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 16142 1727204103.15089: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 16142 1727204103.15121: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 16142 1727204103.15136: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 16142 1727204103.15157: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py <<< 16142 1727204103.15171: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6343280> <<< 16142 1727204103.15217: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6409dc0> <<< 16142 1727204103.15266: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca644df70> <<< 16142 1727204103.15269: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65600d0> <<< 16142 1727204103.15291: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 16142 1727204103.15313: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 16142 1727204103.15346: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' <<< 16142 1727204103.15349: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6354bb0> <<< 16142 1727204103.15362: stdout chunk (state=3): >>>import 'errno' # <<< 16142 1727204103.15398: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' 
executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca6354ee0> <<< 16142 1727204103.15423: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 16142 1727204103.15426: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 16142 1727204103.15456: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 16142 1727204103.15471: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca63667f0> <<< 16142 1727204103.15491: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 16142 1727204103.15531: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 16142 1727204103.15555: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6366d30> <<< 16142 1727204103.15598: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204103.15602: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca62f4460> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6354fd0> <<< 16142 1727204103.15630: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 16142 1727204103.15695: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca6304340> <<< 16142 1727204103.15698: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6366670> <<< 16142 1727204103.15701: stdout chunk (state=3): >>>import 'pwd' # <<< 16142 1727204103.15732: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca6304400> <<< 16142 1727204103.15776: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca644aa90> <<< 16142 1727204103.15794: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 16142 1727204103.15809: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 16142 1727204103.15831: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 16142 
1727204103.15850: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 16142 1727204103.15883: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca6320760> <<< 16142 1727204103.15904: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 16142 1727204103.15936: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204103.15939: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca6320a30> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6320820> <<< 16142 1727204103.15960: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca6320910> <<< 16142 1727204103.15994: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 16142 1727204103.16193: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca6320d60> <<< 16142 1727204103.16231: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca632a2b0> <<< 16142 1727204103.16234: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca63209a0> <<< 16142 1727204103.16254: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6314af0> <<< 16142 1727204103.16282: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca644a670> <<< 16142 1727204103.16304: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 16142 1727204103.16360: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 16142 1727204103.16396: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6320b50> <<< 16142 1727204103.16540: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/encodings/cp437.pyc' <<< 16142 1727204103.16557: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fdca6243730> <<< 16142 1727204103.16792: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip' <<< 16142 1727204103.16795: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.16884: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.16916: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/__init__.py <<< 16142 1727204103.16921: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.16936: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.16947: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/__init__.py <<< 16142 1727204103.16972: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.18236: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.19188: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6180880> <<< 16142 1727204103.19204: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 16142 1727204103.19232: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 16142 1727204103.19258: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 16142 1727204103.19291: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca6180160> <<< 16142 1727204103.19340: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6180280> <<< 16142 1727204103.19384: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6180fd0> <<< 16142 1727204103.19388: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 16142 1727204103.19450: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca61804f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6180df0> <<< 16142 1727204103.19455: stdout chunk (state=3): >>>import 'atexit' # <<< 16142 1727204103.19487: stdout chunk 
(state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca6180580> <<< 16142 1727204103.19504: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 16142 1727204103.19535: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 16142 1727204103.19580: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6180100> <<< 16142 1727204103.19594: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 16142 1727204103.19610: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 16142 1727204103.19634: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 16142 1727204103.19654: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 16142 1727204103.19684: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 16142 1727204103.19772: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5bc0070> <<< 16142 1727204103.19817: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204103.19821: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca5b093a0> <<< 16142 1727204103.19846: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204103.19849: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca5b090a0> <<< 16142 1727204103.19868: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 16142 1727204103.19880: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 16142 1727204103.19918: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5b09d00> <<< 16142 1727204103.19930: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6168dc0> <<< 16142 1727204103.20093: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca61683a0> <<< 16142 1727204103.20115: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from 
'/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 16142 1727204103.20143: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6168f40> <<< 16142 1727204103.20163: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 16142 1727204103.20179: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 16142 1727204103.20214: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py <<< 16142 1727204103.20218: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 16142 1727204103.20233: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 16142 1727204103.20249: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 16142 1727204103.20270: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py <<< 16142 1727204103.20286: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 16142 1727204103.20288: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca61b7e80> <<< 16142 1727204103.20354: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5beed90> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5bee460> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca617dac0> <<< 16142 1727204103.20391: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca5bee580> <<< 16142 1727204103.20428: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5bee5b0> <<< 16142 1727204103.20458: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 16142 1727204103.20461: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 16142 1727204103.20486: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 16142 1727204103.20528: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 16142 1727204103.20598: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204103.20602: stdout chunk (state=3): >>># extension module '_datetime' executed from 
'/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca5b74f70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca61c92b0> <<< 16142 1727204103.20617: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 16142 1727204103.20631: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 16142 1727204103.20692: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204103.20695: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca5b717f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca61c9430> <<< 16142 1727204103.20722: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 16142 1727204103.20756: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 16142 1727204103.20784: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 16142 1727204103.20796: stdout chunk (state=3): >>>import '_string' # <<< 16142 1727204103.20851: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca61c9c40> <<< 16142 1727204103.20988: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5b71790> <<< 16142 1727204103.21090: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204103.21093: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca61c9100> <<< 16142 1727204103.21125: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204103.21128: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca61c95b0> <<< 16142 1727204103.21171: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204103.21174: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca61c9f70> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca61c2970> <<< 16142 1727204103.21203: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 16142 1727204103.21221: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 16142 1727204103.21237: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 16142 1727204103.21290: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca5b678e0> <<< 16142 1727204103.21487: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204103.21490: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca5b85df0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5b70520> <<< 16142 1727204103.21526: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca5b67e80> <<< 16142 1727204103.21531: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5b70940> # zipimport: zlib available <<< 16142 1727204103.21549: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 16142 1727204103.21572: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.21640: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.21737: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.21742: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.21745: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py <<< 16142 1727204103.21747: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.21760: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 16142 1727204103.21782: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.21870: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.21971: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.22713: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.23258: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded 
from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py <<< 16142 1727204103.23262: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 16142 1727204103.23266: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 16142 1727204103.23287: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 16142 1727204103.23301: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 16142 1727204103.23369: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204103.23373: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca5b80790> <<< 16142 1727204103.23465: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py <<< 16142 1727204103.23471: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5bbf850> <<< 16142 1727204103.23483: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca574afd0> <<< 16142 1727204103.23535: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 16142 1727204103.23546: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.23568: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.23586: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/_text.py <<< 16142 1727204103.23589: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.23778: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.24045: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py <<< 16142 1727204103.24050: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 16142 1727204103.24077: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5bf22e0> <<< 16142 1727204103.24080: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.24703: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.25304: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.25458: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 16142 1727204103.25461: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 16142 1727204103.25498: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py <<< 16142 1727204103.25501: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.25559: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.25650: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/errors.py <<< 16142 1727204103.25654: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.25657: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.25680: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 16142 1727204103.25683: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.25715: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.25757: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 16142 1727204103.25761: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.25946: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.26141: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 16142 1727204103.26167: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 16142 1727204103.26189: stdout chunk (state=3): >>>import '_ast' # <<< 16142 1727204103.26266: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6186ca0> <<< 16142 1727204103.26270: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.26332: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.26402: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py <<< 16142 1727204103.26407: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 16142 1727204103.26437: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.26476: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.26511: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/locale.py <<< 16142 1727204103.26514: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.26555: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.26594: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.26682: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.26742: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 16142 1727204103.26770: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 16142 1727204103.26847: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca5ba3c40> <<< 16142 1727204103.26938: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6186be0> <<< 16142 1727204103.26987: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/process.py <<< 16142 1727204103.26990: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.27038: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.27100: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.27116: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.27159: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 16142 1727204103.27173: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 16142 1727204103.27196: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 16142 1727204103.27232: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 16142 1727204103.27251: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 16142 1727204103.27277: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 16142 1727204103.27353: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5bb6910> <<< 16142 1727204103.27395: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6154b50> <<< 16142 1727204103.27458: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca55a57f0> <<< 16142 1727204103.27462: stdout chunk (state=3): >>># destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 16142 1727204103.27487: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.27514: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # 
loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 16142 1727204103.27595: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/basic.py <<< 16142 1727204103.27599: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.27619: stdout chunk (state=3): >>># zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/modules/__init__.py <<< 16142 1727204103.27632: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.27687: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.27754: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.27757: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.27785: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.27823: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.27861: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.27891: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.27929: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 16142 1727204103.27933: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.28003: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.28070: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.28092: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.28120: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py <<< 16142 1727204103.28133: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.28276: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.28424: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.28452: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.28502: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 16142 1727204103.28545: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 16142 1727204103.28548: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 16142 1727204103.28577: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 16142 1727204103.28613: stdout chunk (state=3): >>>import 'multiprocessing.process' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fdca54a7100> <<< 16142 1727204103.28619: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py <<< 16142 1727204103.28658: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 16142 1727204103.29204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca570ca90> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca570ca00> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca56dfdc0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca56df790> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca57294c0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5729d60> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca56efee0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca56ef9d0> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca56ef1f0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca5509280> <<< 16142 1727204103.29238: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca61d2a30> import 
'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5729070><<< 16142 1727204103.29260: stdout chunk (state=3): >>> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py<<< 16142 1727204103.29263: stdout chunk (state=3): >>> <<< 16142 1727204103.29289: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py <<< 16142 1727204103.29317: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.29339: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.29361: stdout chunk (state=3): >>>import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py<<< 16142 1727204103.29366: stdout chunk (state=3): >>> <<< 16142 1727204103.29392: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.29473: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.29552: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 16142 1727204103.29582: stdout chunk (state=3): >>># zipimport: zlib available<<< 16142 1727204103.29585: stdout chunk (state=3): >>> <<< 16142 1727204103.29665: stdout chunk (state=3): >>># zipimport: zlib available<<< 16142 1727204103.29670: stdout chunk (state=3): >>> <<< 16142 1727204103.29738: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py<<< 16142 1727204103.29741: stdout chunk (state=3): >>> <<< 16142 1727204103.29743: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.29771: stdout chunk (state=3): >>># zipimport: zlib available<<< 16142 1727204103.29775: stdout chunk (state=3): >>> <<< 16142 1727204103.29796: stdout chunk (state=3): >>>import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py<<< 16142 1727204103.29801: stdout chunk (state=3): >>> <<< 16142 1727204103.29819: stdout chunk (state=3): >>># zipimport: zlib available<<< 16142 1727204103.29822: stdout chunk (state=3): >>> <<< 16142 1727204103.29872: stdout chunk (state=3): >>># zipimport: zlib available<<< 16142 1727204103.29877: stdout chunk (state=3): >>> <<< 16142 1727204103.29924: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py<<< 16142 1727204103.29928: stdout chunk (state=3): >>> <<< 16142 1727204103.29947: stdout chunk (state=3): >>># zipimport: zlib available<<< 16142 1727204103.29950: stdout chunk (state=3): >>> <<< 16142 1727204103.30017: stdout chunk (state=3): >>># zipimport: zlib available<<< 16142 1727204103.30020: stdout chunk (state=3): >>> <<< 16142 1727204103.30072: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py<<< 16142 1727204103.30076: 
stdout chunk (state=3): >>> <<< 16142 1727204103.30100: stdout chunk (state=3): >>># zipimport: zlib available<<< 16142 1727204103.30103: stdout chunk (state=3): >>> <<< 16142 1727204103.30158: stdout chunk (state=3): >>># zipimport: zlib available<<< 16142 1727204103.30161: stdout chunk (state=3): >>> <<< 16142 1727204103.30233: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py<<< 16142 1727204103.30238: stdout chunk (state=3): >>> <<< 16142 1727204103.30241: stdout chunk (state=3): >>># zipimport: zlib available<<< 16142 1727204103.30254: stdout chunk (state=3): >>> <<< 16142 1727204103.30327: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.30411: stdout chunk (state=3): >>># zipimport: zlib available<<< 16142 1727204103.30414: stdout chunk (state=3): >>> <<< 16142 1727204103.30520: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.30571: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py<<< 16142 1727204103.30576: stdout chunk (state=3): >>> <<< 16142 1727204103.30589: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 16142 1727204103.30613: stdout chunk (state=3): >>># zipimport: zlib available<<< 16142 1727204103.30617: stdout chunk (state=3): >>> <<< 16142 1727204103.31283: stdout chunk (state=3): >>># zipimport: zlib available<<< 16142 1727204103.31286: stdout chunk (state=3): >>> <<< 16142 1727204103.31903: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py<<< 16142 1727204103.31907: stdout chunk (state=3): >>> <<< 16142 1727204103.31910: stdout chunk (state=3): >>># zipimport: zlib available<<< 16142 1727204103.31912: stdout chunk (state=3): >>> <<< 16142 1727204103.31982: stdout chunk (state=3): >>># zipimport: zlib available<<< 16142 1727204103.31985: stdout chunk (state=3): >>> <<< 16142 1727204103.32060: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.32106: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.32159: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py<<< 16142 1727204103.32164: stdout chunk (state=3): >>> import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py<<< 16142 1727204103.32192: stdout chunk (state=3): >>> <<< 16142 1727204103.32195: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.32242: stdout chunk (state=3): >>># zipimport: zlib available<<< 16142 1727204103.32245: stdout chunk (state=3): >>> <<< 16142 1727204103.32295: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py<<< 16142 1727204103.32300: stdout chunk (state=3): >>> <<< 16142 
1727204103.32313: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.32395: stdout chunk (state=3): >>># zipimport: zlib available<<< 16142 1727204103.32399: stdout chunk (state=3): >>> <<< 16142 1727204103.32462: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py<<< 16142 1727204103.32469: stdout chunk (state=3): >>> <<< 16142 1727204103.32484: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.32528: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.32575: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py<<< 16142 1727204103.32578: stdout chunk (state=3): >>> <<< 16142 1727204103.32594: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.32636: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.32691: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py<<< 16142 1727204103.32697: stdout chunk (state=3): >>> <<< 16142 1727204103.32709: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.32819: stdout chunk (state=3): >>># zipimport: zlib available<<< 16142 1727204103.32822: stdout chunk (state=3): >>> <<< 16142 1727204103.32936: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py<<< 16142 1727204103.32939: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc'<<< 16142 1727204103.32942: stdout chunk (state=3): >>> <<< 16142 1727204103.32988: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca53f9ee0><<< 16142 1727204103.32991: stdout chunk (state=3): >>> <<< 16142 1727204103.33022: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py<<< 16142 1727204103.33025: stdout chunk (state=3): >>> <<< 16142 1727204103.33073: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc'<<< 16142 1727204103.33080: stdout chunk (state=3): >>> <<< 16142 1727204103.33340: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca53f99d0><<< 16142 1727204103.33344: stdout chunk (state=3): >>> <<< 16142 1727204103.33348: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py<<< 16142 1727204103.33351: stdout chunk (state=3): >>> <<< 16142 1727204103.33366: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.33458: stdout chunk (state=3): >>># zipimport: zlib available<<< 16142 1727204103.33462: stdout chunk (state=3): >>> <<< 16142 1727204103.33558: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py<<< 16142 1727204103.33561: stdout chunk (state=3): >>> <<< 16142 1727204103.33566: 
stdout chunk (state=3): >>># zipimport: zlib available<<< 16142 1727204103.33578: stdout chunk (state=3): >>> <<< 16142 1727204103.33690: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.33817: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py<<< 16142 1727204103.33822: stdout chunk (state=3): >>> <<< 16142 1727204103.33841: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.33925: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.34027: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 16142 1727204103.34053: stdout chunk (state=3): >>># zipimport: zlib available<<< 16142 1727204103.34057: stdout chunk (state=3): >>> <<< 16142 1727204103.34112: stdout chunk (state=3): >>># zipimport: zlib available<<< 16142 1727204103.34115: stdout chunk (state=3): >>> <<< 16142 1727204103.34189: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py<<< 16142 1727204103.34192: stdout chunk (state=3): >>> <<< 16142 1727204103.34222: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc'<<< 16142 1727204103.34226: stdout chunk (state=3): >>> <<< 16142 1727204103.34442: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204103.34447: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so'<<< 16142 1727204103.34450: stdout chunk (state=3): >>> import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca5422040><<< 16142 1727204103.34462: stdout chunk (state=3): >>> <<< 16142 1727204103.34854: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca57298e0> <<< 16142 1727204103.34872: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 16142 1727204103.34900: stdout chunk (state=3): >>># zipimport: zlib available<<< 16142 1727204103.34903: stdout chunk (state=3): >>> <<< 16142 1727204103.34980: stdout chunk (state=3): >>># zipimport: zlib available<<< 16142 1727204103.34984: stdout chunk (state=3): >>> <<< 16142 1727204103.35055: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py<<< 16142 1727204103.35059: stdout chunk (state=3): >>> <<< 16142 1727204103.35077: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.35190: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.35293: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.35441: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.35638: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import 
ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 16142 1727204103.35650: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.35697: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.36166: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca546ddf0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca546d580> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available <<< 16142 1727204103.36300: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 16142 1727204103.36303: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.36387: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.36473: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.36505: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.36557: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 16142 1727204103.36560: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 16142 1727204103.36563: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.36639: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.36661: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.36778: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.36900: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip 
/tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 16142 1727204103.36918: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.37014: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.37123: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 16142 1727204103.37126: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.37157: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.37188: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.37617: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.38040: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py <<< 16142 1727204103.38043: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 16142 1727204103.38046: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.38130: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.38229: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 16142 1727204103.38232: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.38311: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.38400: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 16142 1727204103.38403: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.38533: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.38658: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 16142 1727204103.38691: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.38695: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.38698: stdout chunk (state=3): >>>import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 16142 1727204103.38713: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.38746: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.38793: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 16142 1727204103.38796: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.38877: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.38958: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.39130: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 16142 1727204103.39295: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 16142 1727204103.39311: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.39347: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.39386: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 16142 1727204103.39389: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.39412: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.39436: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py <<< 16142 1727204103.39448: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.39503: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.39572: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 16142 1727204103.39575: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.39598: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.39617: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py <<< 16142 1727204103.39631: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.39680: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.39737: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 16142 1727204103.39740: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.39793: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.39847: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 16142 1727204103.39850: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.40068: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.40286: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available <<< 16142 1727204103.40349: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.40398: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 
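The chunks above record the remote Python interpreter importing ansible.module_utils piece by piece out of the AnsiballZ payload archive (/tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip), which is why each import is annotated with "# zipimport: zlib available" and "loaded from Zip ...". As a minimal, generic sketch of that zip-import mechanism using only the Python standard library (the archive path and module name below are hypothetical illustrations, not values from this run):

    # sketch: importing a module directly from a zip archive, the same
    # mechanism the zipimport messages in the log refer to
    import importlib
    import sys
    import zipfile

    payload = "/tmp/example_payload.zip"   # hypothetical archive path
    with zipfile.ZipFile(payload, "w") as zf:
        # a trivial single-module "payload"
        zf.writestr("greeting.py", "MESSAGE = 'hello from the zip'\n")

    sys.path.insert(0, payload)            # zip archives are valid sys.path entries
    greeting = importlib.import_module("greeting")
    print(greeting.MESSAGE)                # -> hello from the zip

In the log, the same idea is applied to the whole module_utils/facts tree: each fact-collector module (system, hardware, network, virtual) is imported from the payload zip before fact collection starts.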
16142 1727204103.40404: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.40441: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.40486: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available <<< 16142 1727204103.41322: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available <<< 16142 1727204103.41384: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.41454: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available <<< 16142 1727204103.41718: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.42070: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 16142 1727204103.42077: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.42131: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.42194: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 16142 1727204103.42201: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.42269: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.42332: stdout 
chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 16142 1727204103.42340: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.42447: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.42561: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 16142 1727204103.42570: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.42678: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.42796: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 16142 1727204103.42904: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204103.43255: stdout chunk (state=3): >>>import 'gc' # <<< 16142 1727204103.43759: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca521dee0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca51e0280> <<< 16142 1727204103.44056: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca51e0190> <<< 16142 1727204103.44844: stdout chunk (state=3): >>> <<< 16142 1727204103.44873: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "03", "epoch": "1727204103", "epoch_int": 
"1727204103", "date": "2024-09-24", "time": "14:55:03", "iso8601_micro": "2024-09-24T18:55:03.431485Z", "iso8601": "2024-09-24T18:55:03Z", "iso8601_basic": "20240924T145503431485", "iso8601_basic_short": "20240924T145503", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "e28ab0e542474a869c23f7ace4640799", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAPleAC0mV69PNpLSbmzZvoLD9LsCBzX6IHRLXV1uktk0r66T6Y57EoVgflJTdo6yU0zTaJjonNzFmvC69tiRsCyywGjnvnBOvIH2vrgNGCUdVYPZbbtmQlJvol7NFFfyXQR4RSPqBKT67rYbCzbETM4j+bdDgTeDk6l7wXwz9RVvAAAAFQCuAyyjbOBDKyIW26LGcI9/nmWpHwAAAIEApIE1W6KQ7qs5kJXBdSaPoWaZUxuQhXkPWORFe7/MBn5SojDfxvJjFPo6t4QsovaCnm532Zghh1ZdB0pNm0vYcRbz3wMdfMucw/KHWt6ZEtI+sLwuMyhAVEXzmE34iXkyePtELiYzY6NyxuJ04IujI9UwD7ZnqFBHVFz529oXikIAAACBAPdUu+4Qo82CMcmrGD9vNUgtsts6GCjqBDuov8GJEALZ9ZNLlyVoNtBHLMQH9e0czLygyNGw/IDosRQkKdX4Vh4A7KXujTIOyytaN4JVJCuOBY/PeX4lreAO/UTTUJ27yT/J0Oy2Hbt+d8fZnTkZReRNPFCzvdb1nuPMG5nAyQtL", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzkKXWiNuOrU77QQcZuT2T9XVh655Sh8Sv9vLWLa1uj7ceaNsB0TBiqvDFvYPENhdKceYaGAFU7sjqbmp5dlivYwPBiBWvcOgqnpBqrMG5SvP1RMiORpW6GupBLnUaMVjopPLIi0/CDlSl2eODcEnQI6BpxCCSedEKU9UrRrCFJy+6KPQXepPwKwPTd1TMzO8wpo57B5MYrjnquTNxMfgBkYsHB/V77d0tKq8qGBTkAPD8wEWLIcZOI+SyYEfCraQ95dOGAPRTFijnd7S15CugSlJ/vvcHSFXOlbgFzeNnU2jZneagkBfaOJch72opD3ebISSHCx1/kJvHN7MbksI+ljJa3Nw5LwP1XjUpT7dQMOZJDdVStXKp86K4XpWud+wMbQVVyU5QoFsCl7YTWWmSDRiPJOQI2myfizCT8i42rJ0WXm5OnqpHn1Jw4nGlcVnfgPQA/zxMldzReXdHnvriqKC9+97XgY6pj42YYP78PhOu1D2xH1AXmloNM+63VvU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPT1h7wNcUomxtav688iXvnCnFqrHnEKf4gRaBY3w4BwbWOGxE8hq5snF9Tp+0agFeN/u980/y8BJWdWIO9Lz8I=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPe8liWy3mh5GzCz9W616J2ArXnLOjLOZSwfmBX3Q1SI", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distributio<<< 16142 1727204103.44895: stdout chunk (state=3): >>>n_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", 
"ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 48676 10.31.13.78 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 48676 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 16142 1727204103.45376: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin <<< 16142 1727204103.45413: stdout chunk (state=3): >>># restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs <<< 16142 1727204103.45455: stdout chunk (state=3): >>># cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy 
keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves <<< 16142 
1727204103.45488: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux <<< 16142 1727204103.45537: stdout chunk (state=3): >>># cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai 
# cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos 
# cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd <<< 16142 1727204103.45573: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd <<< 16142 1727204103.45580: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy 
ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 16142 1727204103.45819: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 16142 1727204103.45844: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 16142 1727204103.45890: stdout chunk (state=3): >>># destroy zipimport # destroy _compression <<< 16142 1727204103.45898: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 16142 1727204103.45939: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 16142 1727204103.45945: stdout chunk (state=3): >>># destroy _json # destroy encodings <<< 16142 1727204103.45974: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 16142 1727204103.46017: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 16142 1727204103.46057: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool <<< 16142 1727204103.46076: stdout chunk (state=3): >>># destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 16142 1727204103.46108: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction <<< 16142 1727204103.46130: stdout chunk (state=3): >>># destroy shlex <<< 16142 1727204103.46146: stdout chunk (state=3): >>># destroy datetime <<< 16142 1727204103.46176: stdout chunk (state=3): >>># destroy base64 # destroy ansible.module_utils.compat.selinux <<< 16142 1727204103.46203: stdout chunk (state=3): >>># destroy getpass # destroy json <<< 16142 1727204103.46211: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 16142 1727204103.46250: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux <<< 16142 1727204103.46292: stdout chunk (state=3): >>># cleanup[3] wiping 
ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform <<< 16142 1727204103.46350: stdout chunk (state=3): >>># destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil <<< 16142 1727204103.46378: stdout chunk (state=3): >>># destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile <<< 16142 1727204103.46406: stdout chunk (state=3): >>># destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types <<< 16142 1727204103.46439: stdout chunk (state=3): >>># cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 16142 1727204103.46445: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 16142 1727204103.46486: stdout chunk (state=3): >>># destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 16142 1727204103.46650: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 16142 1727204103.46689: 
stdout chunk (state=3): >>># destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 16142 1727204103.46732: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 16142 1727204103.46741: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 16142 1727204103.46782: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 16142 1727204103.47172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 16142 1727204103.47177: stdout chunk (state=3): >>><<< 16142 1727204103.47184: stderr chunk (state=3): >>><<< 16142 1727204103.47328: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6843dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65d83a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6843b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6843ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65d8490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65d8940> import 'io' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fdca65d8670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca658f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca658f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65b2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca658f940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65f0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6588d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65b2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65d8970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6552f10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65590a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca654c5b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65536a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65523d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca643ae80> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca643a970> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca643af70> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca643adc0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca644a130> import '_collections' # import 
'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca652edf0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65276d0> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca653a730> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca655aeb0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca644ad30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca652e310> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca653a340> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6560a60> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca644af10> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca644ae50> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca644adc0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca641e430> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches 
/usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca641e520> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6453fa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca644daf0> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca644d4c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6343280> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6409dc0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca644df70> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca65600d0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6354bb0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca6354ee0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca63667f0> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6366d30> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca62f4460> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6354fd0> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # 
extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca6304340> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6366670> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca6304400> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca644aa90> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca6320760> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca6320a30> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6320820> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca6320910> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca6320d60> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca632a2b0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca63209a0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6314af0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca644a670> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 
'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6320b50> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fdca6243730> # zipimport: found 103 names in '/tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6180880> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca6180160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6180280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6180fd0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca61804f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6180df0> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca6180580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6180100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code 
object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5bc0070> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca5b093a0> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca5b090a0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5b09d00> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6168dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca61683a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6168f40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca61b7e80> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5beed90> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5bee460> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca617dac0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca5bee580> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5bee5b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca5b74f70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca61c92b0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca5b717f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca61c9430> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca61c9c40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5b71790> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca61c9100> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca61c95b0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca61c9f70> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca61c2970> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from 
'/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca5b678e0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca5b85df0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5b70520> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca5b67e80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5b70940> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca5b80790> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5bbf850> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca574afd0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/_text.py # 
zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5bf22e0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6186ca0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca5ba3c40> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6186be0> import 
ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5bb6910> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca6154b50> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca55a57f0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca54a7100> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca570ca90> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca570ca00> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca56dfdc0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca56df790> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca57294c0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5729d60> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca56efee0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca56ef9d0> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca56ef1f0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca5509280> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca61d2a30> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca5729070> import 
ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca53f9ee0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca53f99d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca5422040> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca57298e0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # 
zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca546ddf0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca546d580> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip 
/tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_a09kntc4/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available import 'gc' # # 
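(The stderr block above is the managed node's Python 3.9 interpreter running with verbose imports enabled while it loads the setup module: every "loaded from Zip /tmp/ansible_setup_payload_.../ansible_setup_payload.zip/..." line is a module being resolved out of the temporary zip payload the controller shipped over SSH, and the facts/system, facts/hardware, facts/network and facts/virtual packages are the per-platform fact collectors the module can run. A minimal sketch of that zip-based import mechanism follows; the archive path and module names are made up for illustration and are not the Ansible payload itself.)

    # Minimal sketch of importing a package straight from a zip archive,
    # the mechanism behind the "loaded from Zip ..." messages above.
    # The archive and module names here are hypothetical.
    import os
    import sys
    import tempfile
    import zipfile

    payload = os.path.join(tempfile.mkdtemp(), "demo_payload.zip")

    # Put a tiny package inside the archive.
    with zipfile.ZipFile(payload, "w") as zf:
        zf.writestr("demopkg/__init__.py", "")
        zf.writestr("demopkg/facts.py", "GREETING = 'hello from inside the zip'\n")

    # Adding the archive to sys.path lets Python's zipimport machinery
    # resolve demopkg without unpacking it to disk first.
    sys.path.insert(0, payload)
    import demopkg.facts

    print(demopkg.facts.GREETING)   # hello from inside the zip
    print(demopkg.facts.__file__)   # .../demo_payload.zip/demopkg/facts.py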
/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdca521dee0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca51e0280> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdca51e0190> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "03", "epoch": "1727204103", "epoch_int": "1727204103", "date": "2024-09-24", "time": "14:55:03", "iso8601_micro": "2024-09-24T18:55:03.431485Z", "iso8601": "2024-09-24T18:55:03Z", "iso8601_basic": "20240924T145503431485", "iso8601_basic_short": "20240924T145503", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "e28ab0e542474a869c23f7ace4640799", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAPleAC0mV69PNpLSbmzZvoLD9LsCBzX6IHRLXV1uktk0r66T6Y57EoVgflJTdo6yU0zTaJjonNzFmvC69tiRsCyywGjnvnBOvIH2vrgNGCUdVYPZbbtmQlJvol7NFFfyXQR4RSPqBKT67rYbCzbETM4j+bdDgTeDk6l7wXwz9RVvAAAAFQCuAyyjbOBDKyIW26LGcI9/nmWpHwAAAIEApIE1W6KQ7qs5kJXBdSaPoWaZUxuQhXkPWORFe7/MBn5SojDfxvJjFPo6t4QsovaCnm532Zghh1ZdB0pNm0vYcRbz3wMdfMucw/KHWt6ZEtI+sLwuMyhAVEXzmE34iXkyePtELiYzY6NyxuJ04IujI9UwD7ZnqFBHVFz529oXikIAAACBAPdUu+4Qo82CMcmrGD9vNUgtsts6GCjqBDuov8GJEALZ9ZNLlyVoNtBHLMQH9e0czLygyNGw/IDosRQkKdX4Vh4A7KXujTIOyytaN4JVJCuOBY/PeX4lreAO/UTTUJ27yT/J0Oy2Hbt+d8fZnTkZReRNPFCzvdb1nuPMG5nAyQtL", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzkKXWiNuOrU77QQcZuT2T9XVh655Sh8Sv9vLWLa1uj7ceaNsB0TBiqvDFvYPENhdKceYaGAFU7sjqbmp5dlivYwPBiBWvcOgqnpBqrMG5SvP1RMiORpW6GupBLnUaMVjopPLIi0/CDlSl2eODcEnQI6BpxCCSedEKU9UrRrCFJy+6KPQXepPwKwPTd1TMzO8wpo57B5MYrjnquTNxMfgBkYsHB/V77d0tKq8qGBTkAPD8wEWLIcZOI+SyYEfCraQ95dOGAPRTFijnd7S15CugSlJ/vvcHSFXOlbgFzeNnU2jZneagkBfaOJch72opD3ebISSHCx1/kJvHN7MbksI+ljJa3Nw5LwP1XjUpT7dQMOZJDdVStXKp86K4XpWud+wMbQVVyU5QoFsCl7YTWWmSDRiPJOQI2myfizCT8i42rJ0WXm5OnqpHn1Jw4nGlcVnfgPQA/zxMldzReXdHnvriqKC9+97XgY6pj42YYP78PhOu1D2xH1AXmloNM+63VvU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPT1h7wNcUomxtav688iXvnCnFqrHnEKf4gRaBY3w4BwbWOGxE8hq5snF9Tp+0agFeN/u980/y8BJWdWIO9Lz8I=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPe8liWy3mh5GzCz9W616J2ArXnLOjLOZSwfmBX3Q1SI", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 48676 10.31.13.78 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 48676 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": 
{"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] 
removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] 
removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing 
ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy 
ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # 
destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # 
destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
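(The module's stdout is the {"ansible_facts": ...} document embedded above: with gather_subset ["min"] only the inexpensive system-level facts are collected — distribution, kernel, user, env, SSH host keys, date_time, selinux, pkg_mgr, service_mgr and so on — and the invocation block echoes the module_args the controller passed (gather_timeout 10, fact_path /etc/ansible/facts.d). Because the remote environment exports PYTHONVERBOSE=1 (visible in ansible_env), the interpreter's shutdown messages ("# clear ...", "# cleanup[2] ...", "# destroy ...") are written after the JSON on the same stream, which is what produces the [WARNING] about junk after the JSON data that follows. The trailing debug1/debug2 lines are the controller-side OpenSSH client reusing the existing multiplexed master connection and then closing the shared connection. A rough illustration of the kind of filtering needed to recover the result from such mixed output — an illustrative helper only, not Ansible's own parser:)

    # Illustrative helper (not Ansible's actual implementation): pull the
    # JSON result out of module output that has extra lines after it.
    import json

    def extract_module_result(raw_output: str) -> dict:
        """Return the first JSON object that starts at a line boundary."""
        lines = raw_output.splitlines()
        for start, line in enumerate(lines):
            if not line.lstrip().startswith("{"):
                continue
            # Grow the candidate one line at a time until it parses.
            for end in range(start, len(lines)):
                candidate = "\n".join(lines[start:end + 1])
                try:
                    return json.loads(candidate)
                except json.JSONDecodeError:
                    continue
        raise ValueError("no JSON object found in module output")

    raw = (
        '{"ansible_facts": {"ansible_fips": false, "ansible_system": "Linux"}}\n'
        "# clear builtins._\n"
        "# cleanup[2] removing sys\n"
    )
    facts = extract_module_result(raw)["ansible_facts"]
    print(facts["ansible_system"], facts["ansible_fips"])   # Linux False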
[WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy 
zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # 
cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing 
ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy 
ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # 
destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy 
systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 16142 1727204103.48602: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204102.8670805-16432-261786856753140/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204103.48606: _low_level_execute_command(): starting 16142 1727204103.48608: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204102.8670805-16432-261786856753140/ > /dev/null 2>&1 && sleep 0' 16142 1727204103.49772: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204103.49827: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204103.49840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204103.49855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204103.49901: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204103.49931: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204103.49950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204103.49971: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204103.50017: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204103.50023: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204103.50137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204103.50801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204103.50814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204103.50822: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204103.50828: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204103.50841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204103.51077: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204103.51082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204103.51085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204103.51116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204103.52949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204103.52955: stdout chunk (state=3): >>><<< 16142 1727204103.52957: stderr chunk (state=3): >>><<< 16142 1727204103.52982: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204103.52988: handler run complete 16142 1727204103.53041: variable 'ansible_facts' from source: unknown 16142 1727204103.53093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204103.53196: variable 'ansible_facts' from source: unknown 16142 1727204103.53259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204103.53366: attempt loop complete, returning result 16142 1727204103.53369: _execute() done 16142 1727204103.53372: dumping result to json 16142 1727204103.53385: done dumping result, returning 16142 1727204103.53395: done running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affcd87-79f5-fddd-f6c7-0000000001cd] 16142 1727204103.53397: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001cd 16142 1727204103.53563: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001cd 16142 1727204103.53568: WORKER PROCESS EXITING ok: [managed-node2] 16142 1727204103.53666: no more pending results, returning what we have 16142 1727204103.53669: results queue empty 16142 1727204103.53670: checking for any_errors_fatal 16142 1727204103.53672: done checking for any_errors_fatal 16142 1727204103.53672: checking for max_fail_percentage 16142 1727204103.53674: done checking for max_fail_percentage 16142 1727204103.53675: 
checking to see if all hosts have failed and the running result is not ok 16142 1727204103.53675: done checking to see if all hosts have failed 16142 1727204103.53676: getting the remaining hosts for this loop 16142 1727204103.53677: done getting the remaining hosts for this loop 16142 1727204103.53682: getting the next task for host managed-node2 16142 1727204103.53691: done getting next task for host managed-node2 16142 1727204103.53694: ^ task is: TASK: Check if system is ostree 16142 1727204103.53696: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204103.53700: getting variables 16142 1727204103.53702: in VariableManager get_vars() 16142 1727204103.53732: Calling all_inventory to load vars for managed-node2 16142 1727204103.53735: Calling groups_inventory to load vars for managed-node2 16142 1727204103.53738: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204103.53750: Calling all_plugins_play to load vars for managed-node2 16142 1727204103.53753: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204103.53756: Calling groups_plugins_play to load vars for managed-node2 16142 1727204103.53934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204103.54122: done with get_vars() 16142 1727204103.54133: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:55:03 -0400 (0:00:00.835) 0:00:02.719 ***** 16142 1727204103.54235: entering _queue_task() for managed-node2/stat 16142 1727204103.54714: worker is 1 (out of 1 available) 16142 1727204103.54727: exiting _queue_task() for managed-node2/stat 16142 1727204103.54739: done queuing things up, now waiting for results queue to drain 16142 1727204103.54741: waiting for pending results... 
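
The long import / "# cleanup[2] removing ..." / "# cleanup[3] wiping ..." walls that bracket each module run in this transcript are ordinary CPython verbose-mode output: the AnsiballZ payload is launched with PYTHONVERBOSE=1 (see the _low_level_execute_command() call for AnsiballZ_stat.py further below), so the remote interpreter traces every import at startup and every module teardown at exit. Below is a minimal sketch that reproduces the same kind of trace outside Ansible; it assumes nothing beyond a local Python 3 interpreter and is not taken from the playbook or the roles under test.

    # Reproduce the verbose import/cleanup trace seen around each AnsiballZ run:
    # PYTHONVERBOSE=1 is equivalent to running "python -v", so the child
    # interpreter prints an "import ..." line for every module it loads and the
    # "# cleanup[2] removing ..." / "# cleanup[3] wiping ..." / "# destroy ..."
    # lines while it tears those modules down at exit (locally, that trace is
    # written to stderr).
    import os
    import subprocess
    import sys

    proc = subprocess.run(
        [sys.executable, "-c", "import json, uuid; print('module work done')"],
        env={**os.environ, "PYTHONVERBOSE": "1"},
        capture_output=True,
        text=True,
    )

    print(proc.stdout)            # the payload's real output ("module work done")
    print(proc.stderr[:1500])     # start of the import trace
    print(proc.stderr[-1500:])    # end of the cleanup/destroy trace

When scanning the rest of this log, that distinction separates interpreter noise from the data that matters: the task results (ok: [managed-node2]), the return codes (rc=0), and the ansible-tmp-... directories being created and removed on the managed host.
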
16142 1727204103.55676: running TaskExecutor() for managed-node2/TASK: Check if system is ostree 16142 1727204103.55803: in run() - task 0affcd87-79f5-fddd-f6c7-0000000001cf 16142 1727204103.55977: variable 'ansible_search_path' from source: unknown 16142 1727204103.55985: variable 'ansible_search_path' from source: unknown 16142 1727204103.56027: calling self._execute() 16142 1727204103.56218: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204103.56229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204103.56247: variable 'omit' from source: magic vars 16142 1727204103.57125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204103.57777: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204103.57917: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204103.57966: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204103.58006: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204103.58124: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204103.58281: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204103.58389: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204103.58420: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204103.58713: Evaluated conditional (not __network_is_ostree is defined): True 16142 1727204103.58802: variable 'omit' from source: magic vars 16142 1727204103.58851: variable 'omit' from source: magic vars 16142 1727204103.58943: variable 'omit' from source: magic vars 16142 1727204103.59039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204103.59089: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204103.59147: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204103.59213: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204103.59269: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204103.59803: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204103.59888: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204103.59896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204103.60086: Set connection var ansible_timeout to 10 16142 1727204103.60215: Set connection var ansible_connection to ssh 16142 1727204103.60227: Set connection var 
ansible_shell_type to sh 16142 1727204103.60241: Set connection var ansible_shell_executable to /bin/sh 16142 1727204103.60251: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204103.60262: Set connection var ansible_pipelining to False 16142 1727204103.60295: variable 'ansible_shell_executable' from source: unknown 16142 1727204103.60385: variable 'ansible_connection' from source: unknown 16142 1727204103.60396: variable 'ansible_module_compression' from source: unknown 16142 1727204103.60434: variable 'ansible_shell_type' from source: unknown 16142 1727204103.60442: variable 'ansible_shell_executable' from source: unknown 16142 1727204103.60450: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204103.60457: variable 'ansible_pipelining' from source: unknown 16142 1727204103.60465: variable 'ansible_timeout' from source: unknown 16142 1727204103.60474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204103.60851: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204103.60984: variable 'omit' from source: magic vars 16142 1727204103.60995: starting attempt loop 16142 1727204103.61002: running the handler 16142 1727204103.61021: _low_level_execute_command(): starting 16142 1727204103.61092: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204103.62122: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204103.62149: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204103.62168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204103.62192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204103.62240: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204103.62256: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204103.62275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204103.62300: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204103.62313: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204103.62324: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204103.62340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204103.62359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204103.62385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204103.62403: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204103.62416: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204103.62434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204103.62520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204103.62548: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 16142 1727204103.62568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204103.62674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 16142 1727204103.64937: stdout chunk (state=3): >>>/root <<< 16142 1727204103.65094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204103.65210: stderr chunk (state=3): >>><<< 16142 1727204103.65215: stdout chunk (state=3): >>><<< 16142 1727204103.65287: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 16142 1727204103.65298: _low_level_execute_command(): starting 16142 1727204103.65301: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204103.652405-16464-2373631671446 `" && echo ansible-tmp-1727204103.652405-16464-2373631671446="` echo /root/.ansible/tmp/ansible-tmp-1727204103.652405-16464-2373631671446 `" ) && sleep 0' 16142 1727204103.66899: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204103.66904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204103.66996: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204103.67000: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204103.67061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204103.67199: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204103.67202: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 16142 1727204103.67302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 16142 1727204103.69992: stdout chunk (state=3): >>>ansible-tmp-1727204103.652405-16464-2373631671446=/root/.ansible/tmp/ansible-tmp-1727204103.652405-16464-2373631671446 <<< 16142 1727204103.70159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204103.70240: stderr chunk (state=3): >>><<< 16142 1727204103.70245: stdout chunk (state=3): >>><<< 16142 1727204103.70494: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204103.652405-16464-2373631671446=/root/.ansible/tmp/ansible-tmp-1727204103.652405-16464-2373631671446 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 16142 1727204103.70497: variable 'ansible_module_compression' from source: unknown 16142 1727204103.70500: ANSIBALLZ: Using lock for stat 16142 1727204103.70502: ANSIBALLZ: Acquiring lock 16142 1727204103.70504: ANSIBALLZ: Lock acquired: 140089295715840 16142 1727204103.70506: ANSIBALLZ: Creating module 16142 1727204104.05170: ANSIBALLZ: Writing module into payload 16142 1727204104.05413: ANSIBALLZ: Writing module 16142 1727204104.05485: ANSIBALLZ: Renaming module 16142 1727204104.05559: ANSIBALLZ: Done creating module 16142 1727204104.05587: variable 'ansible_facts' from source: unknown 16142 1727204104.05789: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204103.652405-16464-2373631671446/AnsiballZ_stat.py 16142 1727204104.07078: Sending initial data 16142 1727204104.07082: Sent initial data (150 bytes) 16142 1727204104.10241: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204104.10411: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204104.10415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204104.10452: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204104.10456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204104.10458: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204104.10752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204104.10756: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204104.10828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204104.12697: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204104.12744: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204104.12804: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmp2x3b7syn /root/.ansible/tmp/ansible-tmp-1727204103.652405-16464-2373631671446/AnsiballZ_stat.py <<< 16142 1727204104.13129: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204104.14245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204104.14249: stderr chunk (state=3): >>><<< 16142 1727204104.14258: stdout chunk (state=3): >>><<< 16142 1727204104.14278: done transferring module to remote 16142 1727204104.14294: _low_level_execute_command(): starting 16142 1727204104.14299: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204103.652405-16464-2373631671446/ /root/.ansible/tmp/ansible-tmp-1727204103.652405-16464-2373631671446/AnsiballZ_stat.py && sleep 0' 16142 1727204104.15867: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204104.15872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204104.16037: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204104.16093: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204104.16119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204104.16143: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204104.16155: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204104.16169: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204104.16182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204104.16196: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204104.16211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204104.16233: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204104.16347: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204104.16366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204104.16461: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204104.16488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204104.16507: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204104.16592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204104.18580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204104.18584: stdout chunk (state=3): >>><<< 16142 1727204104.18587: stderr chunk (state=3): >>><<< 16142 1727204104.18701: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204104.18705: _low_level_execute_command(): starting 16142 1727204104.18708: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204103.652405-16464-2373631671446/AnsiballZ_stat.py && sleep 0' 16142 1727204104.20891: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204104.20929: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204104.20951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204104.20977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204104.21037: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204104.21050: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204104.21154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204104.21194: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 
1727204104.21233: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204104.21304: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204104.21316: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204104.21329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204104.21347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204104.21357: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204104.21373: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204104.21387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204104.21471: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204104.21499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204104.21517: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204104.21602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204104.23599: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 16142 1727204104.23654: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 16142 1727204104.23715: stdout chunk (state=3): >>>import 'posix' # <<< 16142 1727204104.23718: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 16142 1727204104.23756: stdout chunk (state=3): >>>import 'time' # <<< 16142 1727204104.23759: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 16142 1727204104.23817: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 16142 1727204104.23853: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 16142 1727204104.23866: stdout chunk (state=3): >>>import '_codecs' # <<< 16142 1727204104.23933: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c91edc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 16142 1727204104.23966: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c8c33a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c91eb20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 16142 1727204104.24309: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c91eac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches 
/usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c8c3490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c8c3940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c8c3670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c87a190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 16142 1727204104.24411: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c87a220> <<< 16142 1727204104.24444: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 16142 1727204104.24480: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c89d850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c87a940> <<< 16142 1727204104.24555: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c8db880> <<< 16142 1727204104.24558: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c873d90> <<< 16142 1727204104.24642: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # <<< 16142 1727204104.24645: stdout chunk (state=3): >>>import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c89dd90> <<< 16142 1727204104.24704: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c8c3970> <<< 16142 1727204104.24744: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", 
"copyright", "credits" or "license" for more information. <<< 16142 1727204104.25051: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 16142 1727204104.25100: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 16142 1727204104.25158: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 16142 1727204104.25178: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 16142 1727204104.25212: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 16142 1727204104.25254: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 16142 1727204104.25315: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 16142 1727204104.25351: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 16142 1727204104.25367: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c83ff10> <<< 16142 1727204104.25463: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c8440a0> <<< 16142 1727204104.25543: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 16142 1727204104.25583: stdout chunk (state=3): >>>import '_sre' # <<< 16142 1727204104.25619: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 16142 1727204104.25659: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 16142 1727204104.25747: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 16142 1727204104.25827: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c8375b0> <<< 16142 1727204104.25833: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c83e6a0> <<< 16142 1727204104.25845: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c83f3d0> <<< 16142 1727204104.25894: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 16142 1727204104.25990: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 16142 1727204104.26037: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 16142 1727204104.26092: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 16142 1727204104.26152: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py <<< 16142 1727204104.26169: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 16142 1727204104.26237: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204104.26305: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965c596e80> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c596970><<< 16142 1727204104.26313: stdout chunk (state=3): >>> <<< 16142 1727204104.26392: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c596f70><<< 16142 1727204104.26395: stdout chunk (state=3): >>> <<< 16142 1727204104.26437: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 16142 1727204104.26474: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 16142 1727204104.26546: stdout chunk (state=3): >>>import '_operator' # <<< 16142 1727204104.26592: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c596dc0> <<< 16142 1727204104.26652: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 16142 1727204104.26655: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5a6130> <<< 16142 1727204104.26676: stdout chunk (state=3): >>>import '_collections' # <<< 16142 1727204104.26785: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5eedf0> <<< 16142 1727204104.26789: stdout chunk (state=3): >>>import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5e76d0> <<< 16142 1727204104.26841: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5fa730> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c845e80> <<< 16142 1727204104.26892: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 16142 1727204104.26932: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965c5a6d30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5ee310> <<< 16142 1727204104.26962: stdout chunk (state=3): >>># extension module 'binascii' loaded from 
'/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965c5fa340> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c84ba30> <<< 16142 1727204104.26988: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 16142 1727204104.27065: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 16142 1727204104.27121: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5a6f10> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5a6e50> <<< 16142 1727204104.27147: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5a6dc0> <<< 16142 1727204104.27189: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 16142 1727204104.27192: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 16142 1727204104.27241: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 16142 1727204104.27282: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c57a430> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 16142 1727204104.27286: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c57a520> <<< 16142 1727204104.27400: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5aefa0> <<< 16142 1727204104.27474: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5a9af0> <<< 16142 1727204104.27497: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5a94c0> <<< 16142 1727204104.27500: stdout chunk 
(state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 16142 1727204104.27545: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 16142 1727204104.27562: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c4ae280> <<< 16142 1727204104.27596: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c565dc0> <<< 16142 1727204104.27641: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5a9f70> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c84b0a0> <<< 16142 1727204104.27724: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c4bfbb0> import 'errno' # <<< 16142 1727204104.27777: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965c4bfee0> <<< 16142 1727204104.27818: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 16142 1727204104.27845: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c4d17f0> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 16142 1727204104.27870: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 16142 1727204104.27900: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c4d1d30> <<< 16142 1727204104.27955: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965c46a460> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c4bffd0> <<< 16142 1727204104.27970: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code 
object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 16142 1727204104.28023: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965c47a340> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c4d1670> import 'pwd' # <<< 16142 1727204104.28058: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965c47a400> <<< 16142 1727204104.28102: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5a6a90> <<< 16142 1727204104.28139: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 16142 1727204104.28168: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 16142 1727204104.28196: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965c496760> <<< 16142 1727204104.28236: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 16142 1727204104.28274: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965c496a30> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c496820> <<< 16142 1727204104.28305: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965c496910> <<< 16142 1727204104.28317: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 16142 1727204104.28497: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f965c496d60> <<< 16142 1727204104.28542: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965c4a02b0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c4969a0> <<< 16142 1727204104.28564: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c48aaf0> <<< 16142 1727204104.28590: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5a6670> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 16142 1727204104.28646: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 16142 1727204104.28687: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c496b50> <<< 16142 1727204104.28786: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 16142 1727204104.28800: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f965c3b9730> <<< 16142 1727204104.28937: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip' # zipimport: zlib available <<< 16142 1727204104.29025: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.29083: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 16142 1727204104.29093: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.29113: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available <<< 16142 1727204104.31126: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.32726: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' <<< 16142 1727204104.32733: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd96880> <<< 16142 1727204104.32788: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 16142 1727204104.32834: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 16142 1727204104.32849: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 16142 1727204104.32921: stdout chunk (state=3): >>># extension module '_json' loaded from 
'/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bd96160> <<< 16142 1727204104.32984: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd96280> <<< 16142 1727204104.33059: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd96fd0> <<< 16142 1727204104.33097: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc'<<< 16142 1727204104.33100: stdout chunk (state=3): >>> <<< 16142 1727204104.33193: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd964f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd96df0> <<< 16142 1727204104.33197: stdout chunk (state=3): >>>import 'atexit' # <<< 16142 1727204104.33258: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bd96580> <<< 16142 1727204104.33287: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 16142 1727204104.33340: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 16142 1727204104.33404: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd96100> <<< 16142 1727204104.33458: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 16142 1727204104.33495: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 16142 1727204104.33509: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 16142 1727204104.33545: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 16142 1727204104.33599: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py <<< 16142 1727204104.33602: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 16142 1727204104.33723: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bcedfa0> <<< 16142 1727204104.33804: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204104.33807: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bd0bc70> <<< 16142 1727204104.33858: stdout chunk (state=3): >>># extension module 'select' 
loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204104.33895: stdout chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bd0bf70> <<< 16142 1727204104.33909: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 16142 1727204104.33959: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 16142 1727204104.34013: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd0b310> <<< 16142 1727204104.34048: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bdfedc0> <<< 16142 1727204104.34329: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bdfe3a0> <<< 16142 1727204104.34378: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py <<< 16142 1727204104.34417: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 16142 1727204104.34420: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bdfef40> <<< 16142 1727204104.34465: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 16142 1727204104.34479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 16142 1727204104.34530: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 16142 1727204104.34583: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 16142 1727204104.34630: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 16142 1727204104.34661: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bdcde80> <<< 16142 1727204104.34805: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd69d90> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd69460> <<< 16142 1727204104.34822: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bda0550> <<< 16142 1727204104.34892: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204104.34895: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bd69580> <<< 16142 1727204104.34960: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd695b0> <<< 16142 1727204104.35044: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 16142 1727204104.35048: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 16142 1727204104.35070: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 16142 1727204104.35108: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 16142 1727204104.35267: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bcdef70><<< 16142 1727204104.35271: stdout chunk (state=3): >>> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bdde2b0> <<< 16142 1727204104.35310: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 16142 1727204104.35426: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bcdb7f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bdde430><<< 16142 1727204104.35449: stdout chunk (state=3): >>> <<< 16142 1727204104.35461: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 16142 1727204104.35516: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 16142 1727204104.35591: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 16142 1727204104.35687: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bdf6e80> <<< 16142 1727204104.35905: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bcdb790> <<< 16142 1727204104.36065: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204104.36069: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f965bcdb5e0> <<< 16142 1727204104.36133: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204104.36136: stdout chunk (state=3): >>>import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bcda550> <<< 16142 1727204104.36229: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204104.36234: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bcda490> <<< 16142 1727204104.36278: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bdd5970> <<< 16142 1727204104.36293: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 16142 1727204104.36320: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 16142 1727204104.36356: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 16142 1727204104.36441: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bd5f6a0> <<< 16142 1727204104.36756: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 16142 1727204104.36792: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bd5eb80> <<< 16142 1727204104.36795: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd6e0a0> <<< 16142 1727204104.36888: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bd5f100><<< 16142 1727204104.36892: stdout chunk (state=3): >>> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bda2be0> <<< 16142 1727204104.36940: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.36969: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.36989: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip 
/tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py <<< 16142 1727204104.36991: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.37083: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.37213: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py <<< 16142 1727204104.37244: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available <<< 16142 1727204104.37389: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.37537: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.38290: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.39081: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py <<< 16142 1727204104.39103: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 16142 1727204104.39162: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bca6ac0> <<< 16142 1727204104.39256: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd5cd00> <<< 16142 1727204104.39275: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd53850> <<< 16142 1727204104.39354: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available <<< 16142 1727204104.39378: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 16142 1727204104.39559: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.39754: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 16142 1727204104.39797: stdout chunk (state=3): >>>import 'copy' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd5e9d0> <<< 16142 1727204104.39800: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.40420: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.41042: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.41103: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.41192: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 16142 1727204104.41238: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.41290: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py <<< 16142 1727204104.41293: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.41369: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.41498: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/errors.py <<< 16142 1727204104.41515: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py <<< 16142 1727204104.41518: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.41550: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.41601: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available <<< 16142 1727204104.41891: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.42185: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 16142 1727204104.42227: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 16142 1727204104.42336: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965b8d0310> # zipimport: zlib available <<< 16142 1727204104.42434: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.42537: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py <<< 16142 1727204104.42541: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py <<< 16142 1727204104.42554: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.42597: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 16142 1727204104.42651: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/locale.py <<< 16142 1727204104.42655: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.42695: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.42742: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.42856: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.42945: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 16142 1727204104.42976: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 16142 1727204104.43073: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bde72b0> <<< 16142 1727204104.43112: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd5c7c0> <<< 16142 1727204104.43156: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 16142 1727204104.43349: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.43415: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.43456: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.43515: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 16142 1727204104.43519: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 16142 1727204104.43535: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 16142 1727204104.43595: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 16142 1727204104.43610: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 16142 1727204104.43747: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965b8b3760> <<< 16142 1727204104.43807: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bc9d610> <<< 16142 1727204104.43881: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bc9cb80> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip 
/tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 16142 1727204104.43932: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.43947: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py <<< 16142 1727204104.44069: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available <<< 16142 1727204104.44086: stdout chunk (state=3): >>># zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/modules/__init__.py <<< 16142 1727204104.44089: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.44245: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.44499: stdout chunk (state=3): >>># zipimport: zlib available <<< 16142 1727204104.44706: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 16142 1727204104.44726: stdout chunk (state=3): >>># destroy __main__ <<< 16142 1727204104.45293: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # 
cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random <<< 16142 1727204104.45312: stdout chunk (state=3): >>># destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing 
ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 16142 1727204104.45583: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 16142 1727204104.45597: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 16142 1727204104.45695: stdout chunk (state=3): >>># destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 16142 1727204104.45710: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse <<< 16142 1727204104.45996: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] 
wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 16142 1727204104.46000: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 16142 1727204104.46242: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse <<< 16142 1727204104.46259: stdout chunk (state=3): >>># destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 16142 1727204104.46261: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 16142 1727204104.46297: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 16142 1727204104.46797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 16142 1727204104.46800: stdout chunk (state=3): >>><<< 16142 1727204104.46802: stderr chunk (state=3): >>><<< 16142 1727204104.46971: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c91edc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c8c33a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c91eb20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c91eac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c8c3490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c8c3940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c8c3670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from 
'/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c87a190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c87a220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c89d850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c87a940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c8db880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c873d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c89dd90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c8c3970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c83ff10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c8440a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c8375b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c83e6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c83f3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965c596e80> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c596970> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c596f70> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c596dc0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5a6130> import '_collections' # import 
'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5eedf0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5e76d0> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5fa730> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c845e80> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965c5a6d30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5ee310> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965c5fa340> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c84ba30> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5a6f10> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5a6e50> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5a6dc0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c57a430> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches 
/usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c57a520> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5aefa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5a9af0> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5a94c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c4ae280> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c565dc0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5a9f70> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c84b0a0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c4bfbb0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965c4bfee0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c4d17f0> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c4d1d30> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965c46a460> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c4bffd0> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # 
extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965c47a340> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c4d1670> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965c47a400> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5a6a90> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965c496760> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965c496a30> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c496820> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965c496910> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965c496d60> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965c4a02b0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c4969a0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c48aaf0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c5a6670> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 
'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965c496b50> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f965c3b9730> # zipimport: found 30 names in '/tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd96880> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bd96160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd96280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd96fd0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd964f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd96df0> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bd96580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd96100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object 
from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bcedfa0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bd0bc70> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bd0bf70> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd0b310> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bdfedc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bdfe3a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bdfef40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bdcde80> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd69d90> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd69460> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bda0550> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bd69580> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd695b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bcdef70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bdde2b0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bcdb7f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bdde430> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bdf6e80> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bcdb790> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bcdb5e0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bcda550> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bcda490> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bdd5970> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from 
'/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bd5f6a0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bd5eb80> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd6e0a0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bd5f100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bda2be0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bca6ac0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd5cd00> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd53850> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd5e9d0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965b8d0310> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f965bde72b0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bd5c7c0> import ansible.module_utils.common.file # 
loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965b8b3760> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bc9d610> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f965bc9cb80> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_26i29wry/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # 
cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing 
systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy 
ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy 
grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
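Annotation: the exchange above is the remote half of the "Check if system is ostree" task reported further below. The zipped module payload (ansible_stat_payload.zip) runs the stat module under the target's Python 3.9, over the already-established SSH ControlMaster connection ("auto-mux: Trying existing master"), and returns {"changed": false, "stat": {"exists": false}} for /run/ostree-booted. For reference, a task producing exactly this module call would look roughly like the sketch below; it is reconstructed from the logged module_args (all other logged options are stat defaults), and the register variable name is a hypothetical placeholder, not something taken from the log.

    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted
      register: __ostree_booted_stat   # hypothetical name; the real register variable is not shown in the log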
[WARNING]: Module invocation had junk after the JSON data: (the junk is the Python interpreter shutdown/cleanup trace, from "# destroy __main__" through "# clear sys.audit hooks", identical to the output reproduced in full above)
16142 1727204104.47579: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204103.652405-16464-2373631671446/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204104.47587: _low_level_execute_command(): starting 16142 1727204104.47591: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204103.652405-16464-2373631671446/ > /dev/null 2>&1 && sleep 0' 16142 1727204104.49709: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204104.49729: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204104.49747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204104.49771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204104.49818: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204104.49835: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204104.49852: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204104.49873: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204104.49892: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204104.49903: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204104.49916: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204104.49930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204104.49950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204104.49962: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204104.49976: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204104.49996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204104.50078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204104.50109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204104.50128: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204104.50212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204104.52905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204104.52910: stdout chunk (state=3): >>><<< 16142 1727204104.52913: stderr chunk (state=3): >>><<< 16142 1727204104.53216: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204104.53220: handler run complete 16142 1727204104.53223: attempt loop complete, returning result 16142 1727204104.53226: _execute() done 16142 1727204104.53229: dumping result to json 16142 1727204104.53234: done dumping result, returning 16142 1727204104.53237: done running TaskExecutor() for managed-node2/TASK: Check if system is ostree [0affcd87-79f5-fddd-f6c7-0000000001cf] 16142 1727204104.53240: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001cf 16142 1727204104.53321: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001cf 16142 1727204104.53325: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { 
"exists": false } } 16142 1727204104.53608: no more pending results, returning what we have 16142 1727204104.53614: results queue empty 16142 1727204104.53615: checking for any_errors_fatal 16142 1727204104.53622: done checking for any_errors_fatal 16142 1727204104.53623: checking for max_fail_percentage 16142 1727204104.53625: done checking for max_fail_percentage 16142 1727204104.53625: checking to see if all hosts have failed and the running result is not ok 16142 1727204104.53626: done checking to see if all hosts have failed 16142 1727204104.53627: getting the remaining hosts for this loop 16142 1727204104.53628: done getting the remaining hosts for this loop 16142 1727204104.53632: getting the next task for host managed-node2 16142 1727204104.53637: done getting next task for host managed-node2 16142 1727204104.53639: ^ task is: TASK: Set flag to indicate system is ostree 16142 1727204104.53642: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204104.53645: getting variables 16142 1727204104.53646: in VariableManager get_vars() 16142 1727204104.53677: Calling all_inventory to load vars for managed-node2 16142 1727204104.53680: Calling groups_inventory to load vars for managed-node2 16142 1727204104.53683: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204104.53693: Calling all_plugins_play to load vars for managed-node2 16142 1727204104.53696: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204104.53699: Calling groups_plugins_play to load vars for managed-node2 16142 1727204104.53946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204104.55310: done with get_vars() 16142 1727204104.55325: done getting variables 16142 1727204104.55450: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:55:04 -0400 (0:00:01.012) 0:00:03.731 ***** 16142 1727204104.55490: entering _queue_task() for managed-node2/set_fact 16142 1727204104.55492: Creating lock for set_fact 16142 1727204104.55809: worker is 1 (out of 1 available) 16142 1727204104.55821: exiting _queue_task() for managed-node2/set_fact 16142 1727204104.55835: done queuing things up, now waiting for results queue to drain 16142 1727204104.55836: waiting for pending results... 
16142 1727204104.56297: running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree 16142 1727204104.56404: in run() - task 0affcd87-79f5-fddd-f6c7-0000000001d0 16142 1727204104.56423: variable 'ansible_search_path' from source: unknown 16142 1727204104.56431: variable 'ansible_search_path' from source: unknown 16142 1727204104.56482: calling self._execute() 16142 1727204104.56576: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204104.56592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204104.56607: variable 'omit' from source: magic vars 16142 1727204104.57361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204104.58069: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204104.58249: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204104.58492: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204104.58598: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204104.58842: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204104.58884: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204104.58917: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204104.58972: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204104.59115: Evaluated conditional (not __network_is_ostree is defined): True 16142 1727204104.59129: variable 'omit' from source: magic vars 16142 1727204104.59181: variable 'omit' from source: magic vars 16142 1727204104.59326: variable '__ostree_booted_stat' from source: set_fact 16142 1727204104.59390: variable 'omit' from source: magic vars 16142 1727204104.59425: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204104.59459: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204104.59489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204104.59510: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204104.59528: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204104.59568: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204104.59582: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204104.59591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204104.59707: Set connection var ansible_timeout to 10 16142 1727204104.59715: 
Set connection var ansible_connection to ssh 16142 1727204104.59726: Set connection var ansible_shell_type to sh 16142 1727204104.59736: Set connection var ansible_shell_executable to /bin/sh 16142 1727204104.59747: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204104.59763: Set connection var ansible_pipelining to False 16142 1727204104.59795: variable 'ansible_shell_executable' from source: unknown 16142 1727204104.59806: variable 'ansible_connection' from source: unknown 16142 1727204104.59813: variable 'ansible_module_compression' from source: unknown 16142 1727204104.59820: variable 'ansible_shell_type' from source: unknown 16142 1727204104.59826: variable 'ansible_shell_executable' from source: unknown 16142 1727204104.59835: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204104.59843: variable 'ansible_pipelining' from source: unknown 16142 1727204104.59850: variable 'ansible_timeout' from source: unknown 16142 1727204104.59859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204104.59974: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204104.59991: variable 'omit' from source: magic vars 16142 1727204104.60001: starting attempt loop 16142 1727204104.60007: running the handler 16142 1727204104.60030: handler run complete 16142 1727204104.60045: attempt loop complete, returning result 16142 1727204104.60051: _execute() done 16142 1727204104.60058: dumping result to json 16142 1727204104.60067: done dumping result, returning 16142 1727204104.60083: done running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree [0affcd87-79f5-fddd-f6c7-0000000001d0] 16142 1727204104.60094: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001d0 ok: [managed-node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 16142 1727204104.60246: no more pending results, returning what we have 16142 1727204104.60249: results queue empty 16142 1727204104.60250: checking for any_errors_fatal 16142 1727204104.60255: done checking for any_errors_fatal 16142 1727204104.60256: checking for max_fail_percentage 16142 1727204104.60258: done checking for max_fail_percentage 16142 1727204104.60258: checking to see if all hosts have failed and the running result is not ok 16142 1727204104.60259: done checking to see if all hosts have failed 16142 1727204104.60259: getting the remaining hosts for this loop 16142 1727204104.60261: done getting the remaining hosts for this loop 16142 1727204104.60266: getting the next task for host managed-node2 16142 1727204104.60276: done getting next task for host managed-node2 16142 1727204104.60279: ^ task is: TASK: Fix CentOS6 Base repo 16142 1727204104.60281: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 16142 1727204104.60284: getting variables 16142 1727204104.60286: in VariableManager get_vars() 16142 1727204104.60315: Calling all_inventory to load vars for managed-node2 16142 1727204104.60318: Calling groups_inventory to load vars for managed-node2 16142 1727204104.60321: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204104.60335: Calling all_plugins_play to load vars for managed-node2 16142 1727204104.60338: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204104.60341: Calling groups_plugins_play to load vars for managed-node2 16142 1727204104.60555: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001d0 16142 1727204104.60578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204104.60897: done with get_vars() 16142 1727204104.60908: done getting variables 16142 1727204104.61281: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:55:04 -0400 (0:00:00.058) 0:00:03.789 ***** 16142 1727204104.61314: entering _queue_task() for managed-node2/copy 16142 1727204104.61927: worker is 1 (out of 1 available) 16142 1727204104.62022: exiting _queue_task() for managed-node2/copy 16142 1727204104.62039: done queuing things up, now waiting for results queue to drain 16142 1727204104.62044: waiting for pending results... 
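The set_fact handler above runs entirely on the control node, so it completes almost immediately: the conditional (not __network_is_ostree is defined) evaluates True, the registered __ostree_booted_stat result is read, and the fact __network_is_ostree ends up false. A hedged sketch of the task, using only the names and the conditional shown in the log (the exact value expression is an assumption):

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # assumed expression; the log only shows the resulting value
      when: not __network_is_ostree is defined

With __network_is_ostree now false, the EPEL bootstrap include further down is still allowed to run.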
16142 1727204104.62594: WORKER PROCESS EXITING 16142 1727204104.62665: running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo 16142 1727204104.63077: in run() - task 0affcd87-79f5-fddd-f6c7-0000000001d2 16142 1727204104.63081: variable 'ansible_search_path' from source: unknown 16142 1727204104.63088: variable 'ansible_search_path' from source: unknown 16142 1727204104.63093: calling self._execute() 16142 1727204104.63273: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204104.63291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204104.63370: variable 'omit' from source: magic vars 16142 1727204104.64220: variable 'ansible_distribution' from source: facts 16142 1727204104.64257: Evaluated conditional (ansible_distribution == 'CentOS'): True 16142 1727204104.64398: variable 'ansible_distribution_major_version' from source: facts 16142 1727204104.64411: Evaluated conditional (ansible_distribution_major_version == '6'): False 16142 1727204104.64420: when evaluation is False, skipping this task 16142 1727204104.64427: _execute() done 16142 1727204104.64437: dumping result to json 16142 1727204104.64444: done dumping result, returning 16142 1727204104.64454: done running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo [0affcd87-79f5-fddd-f6c7-0000000001d2] 16142 1727204104.64469: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001d2 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 16142 1727204104.64669: no more pending results, returning what we have 16142 1727204104.64673: results queue empty 16142 1727204104.64674: checking for any_errors_fatal 16142 1727204104.64677: done checking for any_errors_fatal 16142 1727204104.64678: checking for max_fail_percentage 16142 1727204104.64680: done checking for max_fail_percentage 16142 1727204104.64681: checking to see if all hosts have failed and the running result is not ok 16142 1727204104.64681: done checking to see if all hosts have failed 16142 1727204104.64682: getting the remaining hosts for this loop 16142 1727204104.64683: done getting the remaining hosts for this loop 16142 1727204104.64687: getting the next task for host managed-node2 16142 1727204104.64694: done getting next task for host managed-node2 16142 1727204104.64697: ^ task is: TASK: Include the task 'enable_epel.yml' 16142 1727204104.64701: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204104.64704: getting variables 16142 1727204104.64706: in VariableManager get_vars() 16142 1727204104.64738: Calling all_inventory to load vars for managed-node2 16142 1727204104.64741: Calling groups_inventory to load vars for managed-node2 16142 1727204104.64744: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204104.64757: Calling all_plugins_play to load vars for managed-node2 16142 1727204104.64761: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204104.64765: Calling groups_plugins_play to load vars for managed-node2 16142 1727204104.64949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204104.65159: done with get_vars() 16142 1727204104.65172: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 14:55:04 -0400 (0:00:00.039) 0:00:03.829 ***** 16142 1727204104.65409: entering _queue_task() for managed-node2/include_tasks 16142 1727204104.65423: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001d2 16142 1727204104.65426: WORKER PROCESS EXITING 16142 1727204104.66177: worker is 1 (out of 1 available) 16142 1727204104.66195: exiting _queue_task() for managed-node2/include_tasks 16142 1727204104.66206: done queuing things up, now waiting for results queue to drain 16142 1727204104.66208: waiting for pending results... 16142 1727204104.67039: running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' 16142 1727204104.67272: in run() - task 0affcd87-79f5-fddd-f6c7-0000000001d3 16142 1727204104.67486: variable 'ansible_search_path' from source: unknown 16142 1727204104.67505: variable 'ansible_search_path' from source: unknown 16142 1727204104.67554: calling self._execute() 16142 1727204104.67724: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204104.67736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204104.67749: variable 'omit' from source: magic vars 16142 1727204104.68201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204104.75118: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204104.75277: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204104.75366: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204104.75471: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204104.75567: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204104.75725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204104.75888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204104.75921: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204104.76014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204104.76179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204104.76345: variable '__network_is_ostree' from source: set_fact 16142 1727204104.76430: Evaluated conditional (not __network_is_ostree | d(false)): True 16142 1727204104.76514: _execute() done 16142 1727204104.76528: dumping result to json 16142 1727204104.76540: done dumping result, returning 16142 1727204104.76551: done running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' [0affcd87-79f5-fddd-f6c7-0000000001d3] 16142 1727204104.76562: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001d3 16142 1727204104.76693: no more pending results, returning what we have 16142 1727204104.76699: in VariableManager get_vars() 16142 1727204104.76735: Calling all_inventory to load vars for managed-node2 16142 1727204104.76738: Calling groups_inventory to load vars for managed-node2 16142 1727204104.76742: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204104.76752: Calling all_plugins_play to load vars for managed-node2 16142 1727204104.76755: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204104.76758: Calling groups_plugins_play to load vars for managed-node2 16142 1727204104.76989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204104.77216: done with get_vars() 16142 1727204104.77224: variable 'ansible_search_path' from source: unknown 16142 1727204104.77225: variable 'ansible_search_path' from source: unknown 16142 1727204104.77270: we have included files to process 16142 1727204104.77271: generating all_blocks data 16142 1727204104.77273: done generating all_blocks data 16142 1727204104.77278: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 16142 1727204104.77280: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 16142 1727204104.77282: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 16142 1727204104.78039: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001d3 16142 1727204104.78043: WORKER PROCESS EXITING 16142 1727204104.79312: done processing included file 16142 1727204104.79315: iterating over new_blocks loaded from include file 16142 1727204104.79317: in VariableManager get_vars() 16142 1727204104.79330: done with get_vars() 16142 1727204104.79334: filtering new block on tags 16142 1727204104.79360: done filtering new block on tags 16142 1727204104.79484: in VariableManager get_vars() 16142 1727204104.79497: done with get_vars() 16142 1727204104.79498: filtering new block on tags 16142 1727204104.79510: done filtering new block on tags 16142 1727204104.79512: done iterating over new_blocks loaded from include file included: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node2 16142 1727204104.79518: extending task lists for all hosts with included blocks 16142 1727204104.79724: done extending task lists 16142 1727204104.79725: done processing included files 16142 1727204104.79726: results queue empty 16142 1727204104.79727: checking for any_errors_fatal 16142 1727204104.79730: done checking for any_errors_fatal 16142 1727204104.79732: checking for max_fail_percentage 16142 1727204104.79733: done checking for max_fail_percentage 16142 1727204104.79734: checking to see if all hosts have failed and the running result is not ok 16142 1727204104.79734: done checking to see if all hosts have failed 16142 1727204104.79735: getting the remaining hosts for this loop 16142 1727204104.79736: done getting the remaining hosts for this loop 16142 1727204104.79738: getting the next task for host managed-node2 16142 1727204104.79742: done getting next task for host managed-node2 16142 1727204104.79744: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 16142 1727204104.79747: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204104.79748: getting variables 16142 1727204104.79749: in VariableManager get_vars() 16142 1727204104.79757: Calling all_inventory to load vars for managed-node2 16142 1727204104.79759: Calling groups_inventory to load vars for managed-node2 16142 1727204104.79761: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204104.79769: Calling all_plugins_play to load vars for managed-node2 16142 1727204104.79776: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204104.79778: Calling groups_plugins_play to load vars for managed-node2 16142 1727204104.80612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204104.81040: done with get_vars() 16142 1727204104.81050: done getting variables 16142 1727204104.81248: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 16142 1727204104.81488: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 14:55:04 -0400 (0:00:00.162) 0:00:03.992 ***** 16142 1727204104.81539: entering _queue_task() for managed-node2/command 16142 1727204104.81541: Creating lock for command 16142 1727204104.82327: worker is 1 (out of 1 available) 16142 1727204104.82341: exiting _queue_task() for managed-node2/command 16142 1727204104.82352: done queuing things up, now waiting for results queue to drain 16142 1727204104.82353: waiting for pending results... 
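Above, "Fix CentOS6 Base repo" is skipped because ansible_distribution == 'CentOS' is true but ansible_distribution_major_version == '6' is false on this node, and the include of enable_epel.yml then proceeds because its guard (not __network_is_ostree | d(false)) evaluates True. Note how the included file's first task has a templated name, so the generic "Create EPEL {{ ansible_distribution_major_version }}" renders as "TASK [Create EPEL 9]" once facts are available. A sketch of the include as implied by the log (include_tasks and the condition are reported directly; everything else is assumed):

    - name: Include the task 'enable_epel.yml'
      include_tasks: enable_epel.yml
      when: not __network_is_ostree | d(false)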
16142 1727204104.83104: running TaskExecutor() for managed-node2/TASK: Create EPEL 9 16142 1727204104.83338: in run() - task 0affcd87-79f5-fddd-f6c7-0000000001ed 16142 1727204104.83406: variable 'ansible_search_path' from source: unknown 16142 1727204104.83413: variable 'ansible_search_path' from source: unknown 16142 1727204104.83457: calling self._execute() 16142 1727204104.83583: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204104.83679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204104.83726: variable 'omit' from source: magic vars 16142 1727204104.84496: variable 'ansible_distribution' from source: facts 16142 1727204104.84584: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 16142 1727204104.84868: variable 'ansible_distribution_major_version' from source: facts 16142 1727204104.84921: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 16142 1727204104.84924: when evaluation is False, skipping this task 16142 1727204104.84927: _execute() done 16142 1727204104.84933: dumping result to json 16142 1727204104.84936: done dumping result, returning 16142 1727204104.84939: done running TaskExecutor() for managed-node2/TASK: Create EPEL 9 [0affcd87-79f5-fddd-f6c7-0000000001ed] 16142 1727204104.84945: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001ed skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 16142 1727204104.85193: no more pending results, returning what we have 16142 1727204104.85197: results queue empty 16142 1727204104.85198: checking for any_errors_fatal 16142 1727204104.85200: done checking for any_errors_fatal 16142 1727204104.85200: checking for max_fail_percentage 16142 1727204104.85203: done checking for max_fail_percentage 16142 1727204104.85204: checking to see if all hosts have failed and the running result is not ok 16142 1727204104.85204: done checking to see if all hosts have failed 16142 1727204104.85205: getting the remaining hosts for this loop 16142 1727204104.85206: done getting the remaining hosts for this loop 16142 1727204104.85211: getting the next task for host managed-node2 16142 1727204104.85218: done getting next task for host managed-node2 16142 1727204104.85221: ^ task is: TASK: Install yum-utils package 16142 1727204104.85224: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204104.85232: getting variables 16142 1727204104.85234: in VariableManager get_vars() 16142 1727204104.85267: Calling all_inventory to load vars for managed-node2 16142 1727204104.85271: Calling groups_inventory to load vars for managed-node2 16142 1727204104.85276: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204104.85292: Calling all_plugins_play to load vars for managed-node2 16142 1727204104.85296: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204104.85300: Calling groups_plugins_play to load vars for managed-node2 16142 1727204104.85477: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001ed 16142 1727204104.85480: WORKER PROCESS EXITING 16142 1727204104.85498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204104.85747: done with get_vars() 16142 1727204104.85760: done getting variables 16142 1727204104.85993: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 14:55:04 -0400 (0:00:00.046) 0:00:04.038 ***** 16142 1727204104.86148: entering _queue_task() for managed-node2/package 16142 1727204104.86150: Creating lock for package 16142 1727204104.86815: worker is 1 (out of 1 available) 16142 1727204104.86828: exiting _queue_task() for managed-node2/package 16142 1727204104.86842: done queuing things up, now waiting for results queue to drain 16142 1727204104.86843: waiting for pending results... 
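"Create EPEL 9" is skipped for the same reason every remaining EPEL task will be: the guard requires ansible_distribution in ['RedHat', 'CentOS'] (true here) and ansible_distribution_major_version in ['7', '8'] (false on this EL9 node), and Ansible reports the failing expression as false_condition in the skip result. The log does not show the real module or arguments of this task, so the sketch below uses a debug placeholder purely to illustrate the conditional structure; only the task name and the two when conditions come from the log:

    - name: Create EPEL {{ ansible_distribution_major_version }}
      debug:
        msg: placeholder for the actual EPEL setup step in enable_epel.yml
      when:
        - "ansible_distribution in ['RedHat', 'CentOS']"
        - "ansible_distribution_major_version in ['7', '8']"

The same pair of conditions guards "Install yum-utils package", "Enable EPEL 7" and "Enable EPEL 8" below, which is why they are all skipped with an identical false_condition; "Enable EPEL 6" is additionally guarded by ansible_distribution_major_version == '6' and is skipped as well.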
16142 1727204104.87570: running TaskExecutor() for managed-node2/TASK: Install yum-utils package 16142 1727204104.87801: in run() - task 0affcd87-79f5-fddd-f6c7-0000000001ee 16142 1727204104.87821: variable 'ansible_search_path' from source: unknown 16142 1727204104.87829: variable 'ansible_search_path' from source: unknown 16142 1727204104.87882: calling self._execute() 16142 1727204104.88100: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204104.88113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204104.88133: variable 'omit' from source: magic vars 16142 1727204104.89031: variable 'ansible_distribution' from source: facts 16142 1727204104.89053: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 16142 1727204104.89358: variable 'ansible_distribution_major_version' from source: facts 16142 1727204104.89373: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 16142 1727204104.89381: when evaluation is False, skipping this task 16142 1727204104.89387: _execute() done 16142 1727204104.89393: dumping result to json 16142 1727204104.89447: done dumping result, returning 16142 1727204104.89459: done running TaskExecutor() for managed-node2/TASK: Install yum-utils package [0affcd87-79f5-fddd-f6c7-0000000001ee] 16142 1727204104.89473: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001ee skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 16142 1727204104.89637: no more pending results, returning what we have 16142 1727204104.89641: results queue empty 16142 1727204104.89642: checking for any_errors_fatal 16142 1727204104.89650: done checking for any_errors_fatal 16142 1727204104.89651: checking for max_fail_percentage 16142 1727204104.89653: done checking for max_fail_percentage 16142 1727204104.89653: checking to see if all hosts have failed and the running result is not ok 16142 1727204104.89654: done checking to see if all hosts have failed 16142 1727204104.89655: getting the remaining hosts for this loop 16142 1727204104.89656: done getting the remaining hosts for this loop 16142 1727204104.89661: getting the next task for host managed-node2 16142 1727204104.89671: done getting next task for host managed-node2 16142 1727204104.89675: ^ task is: TASK: Enable EPEL 7 16142 1727204104.89679: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204104.89682: getting variables 16142 1727204104.89685: in VariableManager get_vars() 16142 1727204104.89720: Calling all_inventory to load vars for managed-node2 16142 1727204104.89723: Calling groups_inventory to load vars for managed-node2 16142 1727204104.89727: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204104.89744: Calling all_plugins_play to load vars for managed-node2 16142 1727204104.89747: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204104.89751: Calling groups_plugins_play to load vars for managed-node2 16142 1727204104.89923: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001ee 16142 1727204104.89928: WORKER PROCESS EXITING 16142 1727204104.89946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204104.90160: done with get_vars() 16142 1727204104.90174: done getting variables 16142 1727204104.90236: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 14:55:04 -0400 (0:00:00.042) 0:00:04.080 ***** 16142 1727204104.90390: entering _queue_task() for managed-node2/command 16142 1727204104.91136: worker is 1 (out of 1 available) 16142 1727204104.91148: exiting _queue_task() for managed-node2/command 16142 1727204104.91162: done queuing things up, now waiting for results queue to drain 16142 1727204104.91163: waiting for pending results... 
16142 1727204104.92214: running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 16142 1727204104.92605: in run() - task 0affcd87-79f5-fddd-f6c7-0000000001ef 16142 1727204104.92626: variable 'ansible_search_path' from source: unknown 16142 1727204104.92751: variable 'ansible_search_path' from source: unknown 16142 1727204104.92798: calling self._execute() 16142 1727204104.93102: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204104.93112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204104.93125: variable 'omit' from source: magic vars 16142 1727204104.94076: variable 'ansible_distribution' from source: facts 16142 1727204104.94097: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 16142 1727204104.94415: variable 'ansible_distribution_major_version' from source: facts 16142 1727204104.94500: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 16142 1727204104.94509: when evaluation is False, skipping this task 16142 1727204104.94516: _execute() done 16142 1727204104.94523: dumping result to json 16142 1727204104.94535: done dumping result, returning 16142 1727204104.94547: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 [0affcd87-79f5-fddd-f6c7-0000000001ef] 16142 1727204104.94558: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001ef skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 16142 1727204104.94842: no more pending results, returning what we have 16142 1727204104.94845: results queue empty 16142 1727204104.94846: checking for any_errors_fatal 16142 1727204104.94854: done checking for any_errors_fatal 16142 1727204104.94855: checking for max_fail_percentage 16142 1727204104.94857: done checking for max_fail_percentage 16142 1727204104.94858: checking to see if all hosts have failed and the running result is not ok 16142 1727204104.94858: done checking to see if all hosts have failed 16142 1727204104.94859: getting the remaining hosts for this loop 16142 1727204104.94860: done getting the remaining hosts for this loop 16142 1727204104.94868: getting the next task for host managed-node2 16142 1727204104.94876: done getting next task for host managed-node2 16142 1727204104.94879: ^ task is: TASK: Enable EPEL 8 16142 1727204104.94882: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204104.94886: getting variables 16142 1727204104.94888: in VariableManager get_vars() 16142 1727204104.94918: Calling all_inventory to load vars for managed-node2 16142 1727204104.94921: Calling groups_inventory to load vars for managed-node2 16142 1727204104.94925: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204104.94942: Calling all_plugins_play to load vars for managed-node2 16142 1727204104.94946: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204104.94951: Calling groups_plugins_play to load vars for managed-node2 16142 1727204104.95187: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001ef 16142 1727204104.95191: WORKER PROCESS EXITING 16142 1727204104.95205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204104.95415: done with get_vars() 16142 1727204104.95427: done getting variables 16142 1727204104.95658: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 14:55:04 -0400 (0:00:00.053) 0:00:04.134 ***** 16142 1727204104.95741: entering _queue_task() for managed-node2/command 16142 1727204104.96311: worker is 1 (out of 1 available) 16142 1727204104.96323: exiting _queue_task() for managed-node2/command 16142 1727204104.96450: done queuing things up, now waiting for results queue to drain 16142 1727204104.96452: waiting for pending results... 
16142 1727204104.98539: running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 16142 1727204104.98924: in run() - task 0affcd87-79f5-fddd-f6c7-0000000001f0 16142 1727204104.98949: variable 'ansible_search_path' from source: unknown 16142 1727204104.98957: variable 'ansible_search_path' from source: unknown 16142 1727204104.99042: calling self._execute() 16142 1727204104.99298: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204104.99337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204104.99362: variable 'omit' from source: magic vars 16142 1727204105.00444: variable 'ansible_distribution' from source: facts 16142 1727204105.00468: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 16142 1727204105.00859: variable 'ansible_distribution_major_version' from source: facts 16142 1727204105.00873: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 16142 1727204105.00882: when evaluation is False, skipping this task 16142 1727204105.00889: _execute() done 16142 1727204105.00897: dumping result to json 16142 1727204105.00905: done dumping result, returning 16142 1727204105.00916: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 [0affcd87-79f5-fddd-f6c7-0000000001f0] 16142 1727204105.00928: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001f0 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 16142 1727204105.01213: no more pending results, returning what we have 16142 1727204105.01216: results queue empty 16142 1727204105.01217: checking for any_errors_fatal 16142 1727204105.01222: done checking for any_errors_fatal 16142 1727204105.01223: checking for max_fail_percentage 16142 1727204105.01225: done checking for max_fail_percentage 16142 1727204105.01226: checking to see if all hosts have failed and the running result is not ok 16142 1727204105.01226: done checking to see if all hosts have failed 16142 1727204105.01227: getting the remaining hosts for this loop 16142 1727204105.01228: done getting the remaining hosts for this loop 16142 1727204105.01233: getting the next task for host managed-node2 16142 1727204105.01241: done getting next task for host managed-node2 16142 1727204105.01244: ^ task is: TASK: Enable EPEL 6 16142 1727204105.01247: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204105.01250: getting variables 16142 1727204105.01252: in VariableManager get_vars() 16142 1727204105.01290: Calling all_inventory to load vars for managed-node2 16142 1727204105.01294: Calling groups_inventory to load vars for managed-node2 16142 1727204105.01298: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204105.01314: Calling all_plugins_play to load vars for managed-node2 16142 1727204105.01318: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204105.01322: Calling groups_plugins_play to load vars for managed-node2 16142 1727204105.01497: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001f0 16142 1727204105.01500: WORKER PROCESS EXITING 16142 1727204105.01520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204105.01744: done with get_vars() 16142 1727204105.01755: done getting variables 16142 1727204105.01855: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.061) 0:00:04.195 ***** 16142 1727204105.01891: entering _queue_task() for managed-node2/copy 16142 1727204105.02692: worker is 1 (out of 1 available) 16142 1727204105.02801: exiting _queue_task() for managed-node2/copy 16142 1727204105.02822: done queuing things up, now waiting for results queue to drain 16142 1727204105.02828: waiting for pending results... 
16142 1727204105.03613: running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 16142 1727204105.03906: in run() - task 0affcd87-79f5-fddd-f6c7-0000000001f2 16142 1727204105.03924: variable 'ansible_search_path' from source: unknown 16142 1727204105.03934: variable 'ansible_search_path' from source: unknown 16142 1727204105.03982: calling self._execute() 16142 1727204105.04189: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204105.04326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204105.04349: variable 'omit' from source: magic vars 16142 1727204105.05229: variable 'ansible_distribution' from source: facts 16142 1727204105.05252: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 16142 1727204105.06163: variable 'ansible_distribution_major_version' from source: facts 16142 1727204105.06240: Evaluated conditional (ansible_distribution_major_version == '6'): False 16142 1727204105.06344: when evaluation is False, skipping this task 16142 1727204105.06352: _execute() done 16142 1727204105.06359: dumping result to json 16142 1727204105.06369: done dumping result, returning 16142 1727204105.06381: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 [0affcd87-79f5-fddd-f6c7-0000000001f2] 16142 1727204105.06392: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001f2 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 16142 1727204105.06562: no more pending results, returning what we have 16142 1727204105.06567: results queue empty 16142 1727204105.06568: checking for any_errors_fatal 16142 1727204105.06575: done checking for any_errors_fatal 16142 1727204105.06576: checking for max_fail_percentage 16142 1727204105.06578: done checking for max_fail_percentage 16142 1727204105.06579: checking to see if all hosts have failed and the running result is not ok 16142 1727204105.06580: done checking to see if all hosts have failed 16142 1727204105.06580: getting the remaining hosts for this loop 16142 1727204105.06582: done getting the remaining hosts for this loop 16142 1727204105.06586: getting the next task for host managed-node2 16142 1727204105.06597: done getting next task for host managed-node2 16142 1727204105.06600: ^ task is: TASK: Set network provider to 'nm' 16142 1727204105.06602: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204105.06606: getting variables 16142 1727204105.06608: in VariableManager get_vars() 16142 1727204105.06641: Calling all_inventory to load vars for managed-node2 16142 1727204105.06644: Calling groups_inventory to load vars for managed-node2 16142 1727204105.06648: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204105.06662: Calling all_plugins_play to load vars for managed-node2 16142 1727204105.06668: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204105.06671: Calling groups_plugins_play to load vars for managed-node2 16142 1727204105.06920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204105.07099: done with get_vars() 16142 1727204105.07110: done getting variables 16142 1727204105.07306: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204105.07441: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001f2 16142 1727204105.07444: WORKER PROCESS EXITING TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:13 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.055) 0:00:04.251 ***** 16142 1727204105.07462: entering _queue_task() for managed-node2/set_fact 16142 1727204105.08170: worker is 1 (out of 1 available) 16142 1727204105.08297: exiting _queue_task() for managed-node2/set_fact 16142 1727204105.08308: done queuing things up, now waiting for results queue to drain 16142 1727204105.08310: waiting for pending results... 
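Every EPEL task included above was skipped on this node, so control returns to the test playbook itself: the task at tests_bond_removal_nm.yml:13 sets the provider fact that the rest of the bond-removal test will use. A minimal sketch of that task as implied by the result below (the actual file may differ):

    - name: Set network provider to 'nm'
      set_fact:
        network_provider: nm

Because set_fact is handled by an action plugin on the controller, no module payload is copied to the host, which is why the handler completes immediately and the result is ok with "changed": false.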
16142 1727204105.08952: running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' 16142 1727204105.09167: in run() - task 0affcd87-79f5-fddd-f6c7-000000000007 16142 1727204105.09286: variable 'ansible_search_path' from source: unknown 16142 1727204105.09333: calling self._execute() 16142 1727204105.09487: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204105.09501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204105.09523: variable 'omit' from source: magic vars 16142 1727204105.09730: variable 'omit' from source: magic vars 16142 1727204105.09876: variable 'omit' from source: magic vars 16142 1727204105.09920: variable 'omit' from source: magic vars 16142 1727204105.10085: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204105.10126: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204105.10154: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204105.10184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204105.10284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204105.10318: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204105.10325: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204105.10332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204105.10554: Set connection var ansible_timeout to 10 16142 1727204105.10562: Set connection var ansible_connection to ssh 16142 1727204105.10574: Set connection var ansible_shell_type to sh 16142 1727204105.10604: Set connection var ansible_shell_executable to /bin/sh 16142 1727204105.10614: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204105.10716: Set connection var ansible_pipelining to False 16142 1727204105.10744: variable 'ansible_shell_executable' from source: unknown 16142 1727204105.10751: variable 'ansible_connection' from source: unknown 16142 1727204105.10758: variable 'ansible_module_compression' from source: unknown 16142 1727204105.10766: variable 'ansible_shell_type' from source: unknown 16142 1727204105.10772: variable 'ansible_shell_executable' from source: unknown 16142 1727204105.10779: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204105.10785: variable 'ansible_pipelining' from source: unknown 16142 1727204105.10791: variable 'ansible_timeout' from source: unknown 16142 1727204105.10799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204105.11063: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204105.11154: variable 'omit' from source: magic vars 16142 1727204105.11166: starting attempt loop 16142 1727204105.11173: running the handler 16142 1727204105.11188: handler run complete 16142 1727204105.11201: attempt loop complete, returning result 16142 1727204105.11252: _execute() done 16142 1727204105.11258: 
dumping result to json 16142 1727204105.11270: done dumping result, returning 16142 1727204105.11276: done running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' [0affcd87-79f5-fddd-f6c7-000000000007] 16142 1727204105.11286: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000007 ok: [managed-node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 16142 1727204105.11441: no more pending results, returning what we have 16142 1727204105.11447: results queue empty 16142 1727204105.11448: checking for any_errors_fatal 16142 1727204105.11456: done checking for any_errors_fatal 16142 1727204105.11457: checking for max_fail_percentage 16142 1727204105.11460: done checking for max_fail_percentage 16142 1727204105.11460: checking to see if all hosts have failed and the running result is not ok 16142 1727204105.11461: done checking to see if all hosts have failed 16142 1727204105.11462: getting the remaining hosts for this loop 16142 1727204105.11465: done getting the remaining hosts for this loop 16142 1727204105.11469: getting the next task for host managed-node2 16142 1727204105.11478: done getting next task for host managed-node2 16142 1727204105.11480: ^ task is: TASK: meta (flush_handlers) 16142 1727204105.11482: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204105.11487: getting variables 16142 1727204105.11489: in VariableManager get_vars() 16142 1727204105.11522: Calling all_inventory to load vars for managed-node2 16142 1727204105.11525: Calling groups_inventory to load vars for managed-node2 16142 1727204105.11528: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204105.11541: Calling all_plugins_play to load vars for managed-node2 16142 1727204105.11544: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204105.11547: Calling groups_plugins_play to load vars for managed-node2 16142 1727204105.11720: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000007 16142 1727204105.11723: WORKER PROCESS EXITING 16142 1727204105.11748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204105.11944: done with get_vars() 16142 1727204105.11955: done getting variables 16142 1727204105.12033: in VariableManager get_vars() 16142 1727204105.12044: Calling all_inventory to load vars for managed-node2 16142 1727204105.12047: Calling groups_inventory to load vars for managed-node2 16142 1727204105.12049: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204105.12054: Calling all_plugins_play to load vars for managed-node2 16142 1727204105.12056: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204105.12208: Calling groups_plugins_play to load vars for managed-node2 16142 1727204105.13360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204105.13897: done with get_vars() 16142 1727204105.13916: done queuing things up, now waiting for results queue to drain 16142 1727204105.13918: results queue empty 16142 1727204105.13919: checking for any_errors_fatal 16142 1727204105.13922: done checking for any_errors_fatal 16142 1727204105.13923: checking for 
max_fail_percentage 16142 1727204105.13924: done checking for max_fail_percentage 16142 1727204105.13925: checking to see if all hosts have failed and the running result is not ok 16142 1727204105.13925: done checking to see if all hosts have failed 16142 1727204105.13926: getting the remaining hosts for this loop 16142 1727204105.13928: done getting the remaining hosts for this loop 16142 1727204105.13933: getting the next task for host managed-node2 16142 1727204105.13938: done getting next task for host managed-node2 16142 1727204105.13940: ^ task is: TASK: meta (flush_handlers) 16142 1727204105.14054: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204105.14067: getting variables 16142 1727204105.14068: in VariableManager get_vars() 16142 1727204105.14078: Calling all_inventory to load vars for managed-node2 16142 1727204105.14081: Calling groups_inventory to load vars for managed-node2 16142 1727204105.14083: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204105.14089: Calling all_plugins_play to load vars for managed-node2 16142 1727204105.14091: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204105.14094: Calling groups_plugins_play to load vars for managed-node2 16142 1727204105.14346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204105.14761: done with get_vars() 16142 1727204105.14875: done getting variables 16142 1727204105.15047: in VariableManager get_vars() 16142 1727204105.15058: Calling all_inventory to load vars for managed-node2 16142 1727204105.15060: Calling groups_inventory to load vars for managed-node2 16142 1727204105.15062: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204105.15069: Calling all_plugins_play to load vars for managed-node2 16142 1727204105.15072: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204105.15075: Calling groups_plugins_play to load vars for managed-node2 16142 1727204105.15315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204105.15515: done with get_vars() 16142 1727204105.15530: done queuing things up, now waiting for results queue to drain 16142 1727204105.15533: results queue empty 16142 1727204105.15534: checking for any_errors_fatal 16142 1727204105.15536: done checking for any_errors_fatal 16142 1727204105.15536: checking for max_fail_percentage 16142 1727204105.15538: done checking for max_fail_percentage 16142 1727204105.15538: checking to see if all hosts have failed and the running result is not ok 16142 1727204105.15539: done checking to see if all hosts have failed 16142 1727204105.15540: getting the remaining hosts for this loop 16142 1727204105.15541: done getting the remaining hosts for this loop 16142 1727204105.15544: getting the next task for host managed-node2 16142 1727204105.15547: done getting next task for host managed-node2 16142 1727204105.15548: ^ task is: None 16142 1727204105.15549: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 16142 1727204105.15551: done queuing things up, now waiting for results queue to drain 16142 1727204105.15551: results queue empty 16142 1727204105.15552: checking for any_errors_fatal 16142 1727204105.15553: done checking for any_errors_fatal 16142 1727204105.15553: checking for max_fail_percentage 16142 1727204105.15554: done checking for max_fail_percentage 16142 1727204105.15555: checking to see if all hosts have failed and the running result is not ok 16142 1727204105.15555: done checking to see if all hosts have failed 16142 1727204105.15557: getting the next task for host managed-node2 16142 1727204105.15560: done getting next task for host managed-node2 16142 1727204105.15561: ^ task is: None 16142 1727204105.15562: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204105.15734: in VariableManager get_vars() 16142 1727204105.15775: done with get_vars() 16142 1727204105.15782: in VariableManager get_vars() 16142 1727204105.15880: done with get_vars() 16142 1727204105.15886: variable 'omit' from source: magic vars 16142 1727204105.15922: in VariableManager get_vars() 16142 1727204105.15945: done with get_vars() 16142 1727204105.15968: variable 'omit' from source: magic vars PLAY [Play for testing bond removal] ******************************************* 16142 1727204105.18349: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 16142 1727204105.18834: getting the remaining hosts for this loop 16142 1727204105.18836: done getting the remaining hosts for this loop 16142 1727204105.18840: getting the next task for host managed-node2 16142 1727204105.18843: done getting next task for host managed-node2 16142 1727204105.18846: ^ task is: TASK: Gathering Facts 16142 1727204105.18847: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204105.18849: getting variables 16142 1727204105.18850: in VariableManager get_vars() 16142 1727204105.18880: Calling all_inventory to load vars for managed-node2 16142 1727204105.18883: Calling groups_inventory to load vars for managed-node2 16142 1727204105.18885: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204105.18891: Calling all_plugins_play to load vars for managed-node2 16142 1727204105.18905: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204105.18909: Calling groups_plugins_play to load vars for managed-node2 16142 1727204105.19120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204105.19318: done with get_vars() 16142 1727204105.19328: done getting variables 16142 1727204105.19497: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:3 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.120) 0:00:04.372 ***** 16142 1727204105.19525: entering _queue_task() for managed-node2/gather_facts 16142 1727204105.20272: worker is 1 (out of 1 available) 16142 1727204105.20283: exiting _queue_task() for managed-node2/gather_facts 16142 1727204105.20295: done queuing things up, now waiting for results queue to drain 16142 1727204105.20296: waiting for pending results... 
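
[Editor's note] The implicit "Gathering Facts" task traced below runs the setup module for the play announced just above (PLAY [Play for testing bond removal], playbooks/tests_bond_removal.yml). A hypothetical sketch of a play header that would produce this behaviour; the hosts pattern and other play keywords are assumptions, since the log records only the play name, the task path, and the fact gathering itself:

  # Hypothetical play header; only the play name and fact gathering are confirmed by the log.
  - name: Play for testing bond removal
    hosts: all          # assumption: the actual hosts pattern is not shown in this log
    gather_facts: true  # triggers the "Gathering Facts" task and the ansible_facts JSON seen below
    tasks:
      # ... bond removal test tasks (not shown in this log) ...
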
16142 1727204105.21229: running TaskExecutor() for managed-node2/TASK: Gathering Facts 16142 1727204105.21477: in run() - task 0affcd87-79f5-fddd-f6c7-000000000218 16142 1727204105.21496: variable 'ansible_search_path' from source: unknown 16142 1727204105.21551: calling self._execute() 16142 1727204105.21705: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204105.21847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204105.21862: variable 'omit' from source: magic vars 16142 1727204105.22721: variable 'ansible_distribution_major_version' from source: facts 16142 1727204105.22743: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204105.22755: variable 'omit' from source: magic vars 16142 1727204105.22791: variable 'omit' from source: magic vars 16142 1727204105.22967: variable 'omit' from source: magic vars 16142 1727204105.23015: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204105.23070: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204105.23166: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204105.23190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204105.23261: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204105.23297: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204105.23363: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204105.23374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204105.23597: Set connection var ansible_timeout to 10 16142 1727204105.23605: Set connection var ansible_connection to ssh 16142 1727204105.23617: Set connection var ansible_shell_type to sh 16142 1727204105.23628: Set connection var ansible_shell_executable to /bin/sh 16142 1727204105.23685: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204105.23698: Set connection var ansible_pipelining to False 16142 1727204105.23725: variable 'ansible_shell_executable' from source: unknown 16142 1727204105.23789: variable 'ansible_connection' from source: unknown 16142 1727204105.23802: variable 'ansible_module_compression' from source: unknown 16142 1727204105.23811: variable 'ansible_shell_type' from source: unknown 16142 1727204105.23819: variable 'ansible_shell_executable' from source: unknown 16142 1727204105.23826: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204105.23837: variable 'ansible_pipelining' from source: unknown 16142 1727204105.23844: variable 'ansible_timeout' from source: unknown 16142 1727204105.23851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204105.24200: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204105.24348: variable 'omit' from source: magic vars 16142 1727204105.24359: starting attempt loop 16142 1727204105.24368: running the 
handler 16142 1727204105.24390: variable 'ansible_facts' from source: unknown 16142 1727204105.24466: _low_level_execute_command(): starting 16142 1727204105.24480: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204105.26621: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204105.26726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204105.26766: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204105.26770: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204105.26773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204105.26888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204105.26938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204105.26961: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204105.27045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 16142 1727204105.29288: stdout chunk (state=3): >>>/root <<< 16142 1727204105.29448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204105.29536: stderr chunk (state=3): >>><<< 16142 1727204105.29540: stdout chunk (state=3): >>><<< 16142 1727204105.29677: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 16142 1727204105.29680: _low_level_execute_command(): starting 16142 1727204105.29683: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204105.295701-16541-142554764454532 `" && echo ansible-tmp-1727204105.295701-16541-142554764454532="` echo /root/.ansible/tmp/ansible-tmp-1727204105.295701-16541-142554764454532 `" ) && sleep 0' 16142 1727204105.31769: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204105.31774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204105.31940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204105.31953: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204105.31956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204105.32043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204105.32136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204105.32143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204105.32238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 16142 1727204105.34825: stdout chunk (state=3): >>>ansible-tmp-1727204105.295701-16541-142554764454532=/root/.ansible/tmp/ansible-tmp-1727204105.295701-16541-142554764454532 <<< 16142 1727204105.35079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204105.35084: stdout chunk (state=3): >>><<< 16142 1727204105.35087: stderr chunk (state=3): >>><<< 16142 1727204105.35171: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204105.295701-16541-142554764454532=/root/.ansible/tmp/ansible-tmp-1727204105.295701-16541-142554764454532 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 16142 
1727204105.35175: variable 'ansible_module_compression' from source: unknown 16142 1727204105.35372: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 16142 1727204105.35375: variable 'ansible_facts' from source: unknown 16142 1727204105.35454: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204105.295701-16541-142554764454532/AnsiballZ_setup.py 16142 1727204105.36118: Sending initial data 16142 1727204105.36122: Sent initial data (153 bytes) 16142 1727204105.39318: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204105.39454: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204105.39485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204105.39518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204105.39585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204105.39598: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204105.39615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204105.39634: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204105.39670: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204105.39786: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204105.39801: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204105.39816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204105.39834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204105.39847: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204105.39859: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204105.39876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204105.39956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204105.40017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204105.40036: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204105.40157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 16142 1727204105.42496: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204105.42536: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 
1727204105.42575: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmp4zl5op8m /root/.ansible/tmp/ansible-tmp-1727204105.295701-16541-142554764454532/AnsiballZ_setup.py <<< 16142 1727204105.42617: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204105.45813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204105.45818: stderr chunk (state=3): >>><<< 16142 1727204105.45822: stdout chunk (state=3): >>><<< 16142 1727204105.45850: done transferring module to remote 16142 1727204105.45861: _low_level_execute_command(): starting 16142 1727204105.45867: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204105.295701-16541-142554764454532/ /root/.ansible/tmp/ansible-tmp-1727204105.295701-16541-142554764454532/AnsiballZ_setup.py && sleep 0' 16142 1727204105.46916: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204105.46966: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204105.46985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204105.47005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204105.47062: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204105.47094: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204105.47111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204105.47145: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204105.47169: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204105.47181: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204105.47194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204105.47217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204105.47239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204105.47258: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204105.47277: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204105.47293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204105.47375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204105.47401: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204105.47416: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204105.47500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 16142 1727204105.50519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204105.50523: stdout chunk (state=3): >>><<< 16142 1727204105.50526: stderr chunk (state=3): >>><<< 16142 1727204105.50630: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 16142 1727204105.50638: _low_level_execute_command(): starting 16142 1727204105.50641: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204105.295701-16541-142554764454532/AnsiballZ_setup.py && sleep 0' 16142 1727204105.53275: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204105.53279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204105.53336: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204105.53339: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204105.53342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204105.53737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204105.53833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204105.53844: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204105.53939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 16142 1727204106.21402: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.47, "5m": 0.29, "15m": 0.14}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": 
"disabled"}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "e28ab0e542474a869c23f7ace4640799", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAPleAC0mV69PNpLSbmzZvoLD9LsCBzX6IHRLXV1uktk0r66T6Y57EoVgflJTdo6yU0zTaJjonNzFmvC69tiRsCyywGjnvnBOvIH2vrgNGCUdVYPZbbtmQlJvol7NFFfyXQR4RSPqBKT67rYbCzbETM4j+bdDgTeDk6l7wXwz9RVvAAAAFQCuAyyjbOBDKyIW26LGcI9/nmWpHwAAAIEApIE1W6KQ7qs5kJXBdSaPoWaZUxuQhXkPWORFe7/MBn5SojDfxvJjFPo6t4QsovaCnm532Zghh1ZdB0pNm0vYcRbz3wMdfMucw/KHWt6ZEtI+sLwuMyhAVEXzmE34iXkyePtELiYzY6NyxuJ04IujI9UwD7ZnqFBHVFz529oXikIAAACBAPdUu+4Qo82CMcmrGD9vNUgtsts6GCjqBDuov8GJEALZ9ZNLlyVoNtBHLMQH9e0czLygyNGw/IDosRQkKdX4Vh4A7KXujTIOyytaN4JVJCuOBY/PeX4lreAO/UTTUJ27yT/J0Oy2Hbt+d8fZnTkZReRNPFCzvdb1nuPMG5nAyQtL", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzkKXWiNuOrU77QQcZuT2T9XVh655Sh8Sv9vLWLa1uj7ceaNsB0TBiqvDFvYPENhdKceYaGAFU7sjqbmp5dlivYwPBiBWvcOgqnpBqrMG5SvP1RMiORpW6GupBLnUaMVjopPLIi0/CDlSl2eODcEnQI6BpxCCSedEKU9UrRrCFJy+6KPQXepPwKwPTd1TMzO8wpo57B5MYrjnquTNxMfgBkYsHB/V77d0tKq8qGBTkAPD8wEWLIcZOI+SyYEfCraQ95dOGAPRTFijnd7S15CugSlJ/vvcHSFXOlbgFzeNnU2jZneagkBfaOJch72opD3ebISSHCx1/kJvHN7MbksI+ljJa3Nw5LwP1XjUpT7dQMOZJDdVStXKp86K4XpWud+wMbQVVyU5QoFsCl7YTWWmSDRiPJOQI2myfizCT8i42rJ0WXm5OnqpHn1Jw4nGlcVnfgPQA/zxMldzReXdHnvriqKC9+97XgY6pj42YYP78PhOu1D2xH1AXmloNM+63VvU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPT1h7wNcUomxtav688iXvnCnFqrHnEKf4gRaBY3w4BwbWOGxE8hq5snF9Tp+0agFeN/u980/y8BJWdWIO9Lz8I=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPe8liWy3mh5GzCz9W616J2ArXnLOjLOZSwfmBX3Q1SI", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 48676 10.31.13.78 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 48676 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "05", "epoch": "1727204105", "epoch_int": "1727204105", "date": "2024-09-24", "time": "14:55:05", "iso8601_micro": "2024-09-24T18:55:05.920142Z", "iso8601": "2024-09-24T18:55:05Z", "iso8601_basic": "20240924T145505920142", "iso8601_basic_short": "20240924T145505", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:ff:f5:f2:b9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.78", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ffff:fef5:<<< 16142 1727204106.21418: stdout chunk (state=3): >>>f2b9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": 
"off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]"<<< 16142 1727204106.21444: stdout chunk (state=3): >>>, "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": 
"10.31.13.78", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:ff:f5:f2:b9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.78"], "ansible_all_ipv6_addresses": ["fe80::8ff:ffff:fef5:f2b9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.78", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ffff:fef5:f2b9"]}, "ansible_service_mgr": "systemd", "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_pkg_mgr": "dnf", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2781, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 751, "free": 2781}, "nocache": {"free": 3254, "used": 278}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec243623-fa66-7445-44ba-1070930583a9", "ansible_product_uuid": "ec243623-fa66-7445-44ba-1070930583a9", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 468, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264274022400, "block_size": 4096, "block_total": 65519355, "block_available": 64520025, "block_used": 999330, "inode_total": 131071472, "inode_available": 130998226, "inode_used": 73246, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 16142 1727204106.23788: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 16142 1727204106.23908: stderr chunk (state=3): >>><<< 16142 1727204106.23912: stdout chunk (state=3): >>><<< 16142 1727204106.24180: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.47, "5m": 0.29, "15m": 0.14}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "e28ab0e542474a869c23f7ace4640799", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAPleAC0mV69PNpLSbmzZvoLD9LsCBzX6IHRLXV1uktk0r66T6Y57EoVgflJTdo6yU0zTaJjonNzFmvC69tiRsCyywGjnvnBOvIH2vrgNGCUdVYPZbbtmQlJvol7NFFfyXQR4RSPqBKT67rYbCzbETM4j+bdDgTeDk6l7wXwz9RVvAAAAFQCuAyyjbOBDKyIW26LGcI9/nmWpHwAAAIEApIE1W6KQ7qs5kJXBdSaPoWaZUxuQhXkPWORFe7/MBn5SojDfxvJjFPo6t4QsovaCnm532Zghh1ZdB0pNm0vYcRbz3wMdfMucw/KHWt6ZEtI+sLwuMyhAVEXzmE34iXkyePtELiYzY6NyxuJ04IujI9UwD7ZnqFBHVFz529oXikIAAACBAPdUu+4Qo82CMcmrGD9vNUgtsts6GCjqBDuov8GJEALZ9ZNLlyVoNtBHLMQH9e0czLygyNGw/IDosRQkKdX4Vh4A7KXujTIOyytaN4JVJCuOBY/PeX4lreAO/UTTUJ27yT/J0Oy2Hbt+d8fZnTkZReRNPFCzvdb1nuPMG5nAyQtL", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzkKXWiNuOrU77QQcZuT2T9XVh655Sh8Sv9vLWLa1uj7ceaNsB0TBiqvDFvYPENhdKceYaGAFU7sjqbmp5dlivYwPBiBWvcOgqnpBqrMG5SvP1RMiORpW6GupBLnUaMVjopPLIi0/CDlSl2eODcEnQI6BpxCCSedEKU9UrRrCFJy+6KPQXepPwKwPTd1TMzO8wpo57B5MYrjnquTNxMfgBkYsHB/V77d0tKq8qGBTkAPD8wEWLIcZOI+SyYEfCraQ95dOGAPRTFijnd7S15CugSlJ/vvcHSFXOlbgFzeNnU2jZneagkBfaOJch72opD3ebISSHCx1/kJvHN7MbksI+ljJa3Nw5LwP1XjUpT7dQMOZJDdVStXKp86K4XpWud+wMbQVVyU5QoFsCl7YTWWmSDRiPJOQI2myfizCT8i42rJ0WXm5OnqpHn1Jw4nGlcVnfgPQA/zxMldzReXdHnvriqKC9+97XgY6pj42YYP78PhOu1D2xH1AXmloNM+63VvU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPT1h7wNcUomxtav688iXvnCnFqrHnEKf4gRaBY3w4BwbWOGxE8hq5snF9Tp+0agFeN/u980/y8BJWdWIO9Lz8I=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPe8liWy3mh5GzCz9W616J2ArXnLOjLOZSwfmBX3Q1SI", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, 
"ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 48676 10.31.13.78 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 48676 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "05", "epoch": "1727204105", "epoch_int": "1727204105", "date": "2024-09-24", "time": "14:55:05", "iso8601_micro": "2024-09-24T18:55:05.920142Z", "iso8601": "2024-09-24T18:55:05Z", "iso8601_basic": "20240924T145505920142", "iso8601_basic_short": "20240924T145505", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:ff:f5:f2:b9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.78", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ffff:fef5:f2b9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", 
"rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", 
"tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.78", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:ff:f5:f2:b9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.78"], "ansible_all_ipv6_addresses": ["fe80::8ff:ffff:fef5:f2b9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.78", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ffff:fef5:f2b9"]}, "ansible_service_mgr": "systemd", "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_pkg_mgr": "dnf", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2781, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 751, "free": 2781}, "nocache": {"free": 3254, "used": 278}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec243623-fa66-7445-44ba-1070930583a9", "ansible_product_uuid": "ec243623-fa66-7445-44ba-1070930583a9", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": 
["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 468, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264274022400, "block_size": 4096, "block_total": 65519355, "block_available": 64520025, "block_used": 999330, "inode_total": 131071472, "inode_available": 130998226, "inode_used": 73246, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
16142 1727204106.24373: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204105.295701-16541-142554764454532/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204106.24405: _low_level_execute_command(): starting 16142 1727204106.24414: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204105.295701-16541-142554764454532/ > /dev/null 2>&1 && sleep 0' 16142 1727204106.25117: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204106.25138: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204106.25166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204106.25186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204106.25234: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204106.25250: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204106.25277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204106.25297: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204106.25310: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204106.25321: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204106.25338: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204106.25354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204106.25385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204106.25399: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204106.25410: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204106.25424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204106.25515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204106.25541: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204106.25560: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204106.25648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 16142 1727204106.28268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204106.28272: stdout chunk (state=3): >>><<< 16142 1727204106.28275: stderr chunk (state=3): >>><<< 16142 1727204106.28871: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 16142 1727204106.28876: handler run complete 16142 1727204106.28878: variable 'ansible_facts' from source: unknown 16142 1727204106.28881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204106.28883: variable 'ansible_facts' from source: unknown 16142 1727204106.28924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204106.29058: attempt loop complete, returning result 16142 1727204106.29073: _execute() done 16142 1727204106.29081: dumping result to json 16142 1727204106.29114: done dumping result, returning 16142 1727204106.29129: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [0affcd87-79f5-fddd-f6c7-000000000218] 16142 1727204106.29146: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000218 ok: [managed-node2] 16142 1727204106.29744: no more pending results, returning what we have 16142 1727204106.29748: results queue empty 16142 1727204106.29749: checking for any_errors_fatal 16142 1727204106.29751: done checking for any_errors_fatal 16142 1727204106.29751: checking for max_fail_percentage 16142 1727204106.29755: done checking for max_fail_percentage 16142 1727204106.29756: checking to see if all hosts have failed and the running result is not ok 16142 1727204106.29757: done checking to see if all hosts have failed 16142 1727204106.29758: getting the remaining hosts for this loop 16142 1727204106.29762: done getting the remaining hosts for this loop 16142 1727204106.29771: getting the next task for host managed-node2 16142 1727204106.29780: done getting next task for host managed-node2 16142 1727204106.29782: ^ task is: TASK: meta (flush_handlers) 16142 1727204106.29784: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204106.29787: getting variables 16142 1727204106.29789: in VariableManager get_vars() 16142 1727204106.29841: Calling all_inventory to load vars for managed-node2 16142 1727204106.29844: Calling groups_inventory to load vars for managed-node2 16142 1727204106.29846: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204106.29857: Calling all_plugins_play to load vars for managed-node2 16142 1727204106.29859: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204106.29862: Calling groups_plugins_play to load vars for managed-node2 16142 1727204106.30042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204106.30518: done with get_vars() 16142 1727204106.30529: done getting variables 16142 1727204106.30566: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000218 16142 1727204106.30570: WORKER PROCESS EXITING 16142 1727204106.30614: in VariableManager get_vars() 16142 1727204106.30637: Calling all_inventory to load vars for managed-node2 16142 1727204106.30639: Calling groups_inventory to load vars for managed-node2 16142 1727204106.30641: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204106.30646: Calling all_plugins_play to load vars for managed-node2 16142 1727204106.30649: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204106.30656: Calling groups_plugins_play to load vars for managed-node2 16142 1727204106.30792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204106.30980: done with get_vars() 16142 1727204106.30993: done queuing things up, now waiting for results queue to drain 16142 1727204106.30996: results queue empty 16142 1727204106.30996: checking for any_errors_fatal 16142 1727204106.31000: done checking for any_errors_fatal 16142 1727204106.31001: checking for max_fail_percentage 16142 1727204106.31002: done checking for max_fail_percentage 16142 1727204106.31003: checking to see if all hosts have failed and the running result is not ok 16142 1727204106.31003: done checking to see if all hosts have failed 16142 1727204106.31004: getting the remaining hosts for this loop 16142 1727204106.31005: done getting the remaining hosts for this loop 16142 1727204106.31008: getting the next task for host managed-node2 16142 1727204106.31011: done getting next task for host managed-node2 16142 1727204106.31013: ^ task is: TASK: INIT Prepare setup 16142 1727204106.31015: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204106.31017: getting variables 16142 1727204106.31018: in VariableManager get_vars() 16142 1727204106.31039: Calling all_inventory to load vars for managed-node2 16142 1727204106.31042: Calling groups_inventory to load vars for managed-node2 16142 1727204106.31044: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204106.31048: Calling all_plugins_play to load vars for managed-node2 16142 1727204106.31051: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204106.31054: Calling groups_plugins_play to load vars for managed-node2 16142 1727204106.31186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204106.31371: done with get_vars() 16142 1727204106.31380: done getting variables 16142 1727204106.31457: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [INIT Prepare setup] ****************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:15 Tuesday 24 September 2024 14:55:06 -0400 (0:00:01.119) 0:00:05.491 ***** 16142 1727204106.31488: entering _queue_task() for managed-node2/debug 16142 1727204106.31490: Creating lock for debug 16142 1727204106.31787: worker is 1 (out of 1 available) 16142 1727204106.31799: exiting _queue_task() for managed-node2/debug 16142 1727204106.31810: done queuing things up, now waiting for results queue to drain 16142 1727204106.31811: waiting for pending results... 
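[Editor's note] The task queued here is defined at tests_bond_removal.yml:15 and, judging from the "ok" result with the banner-only MSG logged a few lines below, it is a plain debug task. A minimal sketch of what that task likely looks like; the msg text is inferred from the logged output and the real playbook may differ:

    - name: INIT Prepare setup
      ansible.builtin.debug:
        msg: "##################################################"   # inferred from the MSG in the result below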
16142 1727204106.32077: running TaskExecutor() for managed-node2/TASK: INIT Prepare setup 16142 1727204106.32183: in run() - task 0affcd87-79f5-fddd-f6c7-00000000000b 16142 1727204106.32203: variable 'ansible_search_path' from source: unknown 16142 1727204106.32246: calling self._execute() 16142 1727204106.32414: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204106.32425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204106.32441: variable 'omit' from source: magic vars 16142 1727204106.32792: variable 'ansible_distribution_major_version' from source: facts 16142 1727204106.32811: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204106.32823: variable 'omit' from source: magic vars 16142 1727204106.32851: variable 'omit' from source: magic vars 16142 1727204106.32897: variable 'omit' from source: magic vars 16142 1727204106.32950: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204106.32992: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204106.33026: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204106.33052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204106.33072: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204106.33109: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204106.33118: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204106.33128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204106.33247: Set connection var ansible_timeout to 10 16142 1727204106.33255: Set connection var ansible_connection to ssh 16142 1727204106.33266: Set connection var ansible_shell_type to sh 16142 1727204106.33277: Set connection var ansible_shell_executable to /bin/sh 16142 1727204106.33287: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204106.33300: Set connection var ansible_pipelining to False 16142 1727204106.33327: variable 'ansible_shell_executable' from source: unknown 16142 1727204106.33344: variable 'ansible_connection' from source: unknown 16142 1727204106.33352: variable 'ansible_module_compression' from source: unknown 16142 1727204106.33359: variable 'ansible_shell_type' from source: unknown 16142 1727204106.33369: variable 'ansible_shell_executable' from source: unknown 16142 1727204106.33377: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204106.33384: variable 'ansible_pipelining' from source: unknown 16142 1727204106.33391: variable 'ansible_timeout' from source: unknown 16142 1727204106.33398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204106.33548: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204106.33573: variable 'omit' from source: magic vars 16142 1727204106.33584: starting attempt loop 16142 1727204106.33590: running the handler 16142 
1727204106.33643: handler run complete 16142 1727204106.33678: attempt loop complete, returning result 16142 1727204106.33686: _execute() done 16142 1727204106.33693: dumping result to json 16142 1727204106.33700: done dumping result, returning 16142 1727204106.33712: done running TaskExecutor() for managed-node2/TASK: INIT Prepare setup [0affcd87-79f5-fddd-f6c7-00000000000b] 16142 1727204106.33722: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000000b ok: [managed-node2] => {} MSG: ################################################## 16142 1727204106.33949: no more pending results, returning what we have 16142 1727204106.33952: results queue empty 16142 1727204106.33953: checking for any_errors_fatal 16142 1727204106.33955: done checking for any_errors_fatal 16142 1727204106.33955: checking for max_fail_percentage 16142 1727204106.33957: done checking for max_fail_percentage 16142 1727204106.33958: checking to see if all hosts have failed and the running result is not ok 16142 1727204106.33959: done checking to see if all hosts have failed 16142 1727204106.33960: getting the remaining hosts for this loop 16142 1727204106.33961: done getting the remaining hosts for this loop 16142 1727204106.33966: getting the next task for host managed-node2 16142 1727204106.33974: done getting next task for host managed-node2 16142 1727204106.33977: ^ task is: TASK: Install dnsmasq 16142 1727204106.33981: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204106.33985: getting variables 16142 1727204106.33987: in VariableManager get_vars() 16142 1727204106.34048: Calling all_inventory to load vars for managed-node2 16142 1727204106.34051: Calling groups_inventory to load vars for managed-node2 16142 1727204106.34054: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204106.34068: Calling all_plugins_play to load vars for managed-node2 16142 1727204106.34071: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204106.34077: Calling groups_plugins_play to load vars for managed-node2 16142 1727204106.34241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204106.34443: done with get_vars() 16142 1727204106.34455: done getting variables 16142 1727204106.34746: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000000b 16142 1727204106.34749: WORKER PROCESS EXITING 16142 1727204106.34787: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:55:06 -0400 (0:00:00.033) 0:00:05.525 ***** 16142 1727204106.34820: entering _queue_task() for managed-node2/package 16142 1727204106.35074: worker is 1 (out of 1 available) 16142 1727204106.35085: exiting _queue_task() for managed-node2/package 16142 1727204106.35098: done queuing things up, now waiting for results queue to drain 16142 1727204106.35099: waiting for pending results... 
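[Editor's note] Throughout this run the connection plugin resolves ansible_host and ansible_ssh_extra_args from host vars for managed-node2 and reuses a multiplexed SSH master to 10.31.13.78. A minimal sketch of how that host might be declared in a YAML inventory under those assumptions; the inventory layout is a guess and the ansible_ssh_extra_args value is not shown in this log, so it is left as a placeholder:

    all:
      hosts:
        managed-node2:
          ansible_host: 10.31.13.78        # address taken from the SSH debug output in this log
          ansible_ssh_extra_args: "..."     # referenced as a host var above; actual value not logged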
16142 1727204106.35353: running TaskExecutor() for managed-node2/TASK: Install dnsmasq 16142 1727204106.35473: in run() - task 0affcd87-79f5-fddd-f6c7-00000000000f 16142 1727204106.35488: variable 'ansible_search_path' from source: unknown 16142 1727204106.35494: variable 'ansible_search_path' from source: unknown 16142 1727204106.35536: calling self._execute() 16142 1727204106.35614: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204106.35625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204106.35640: variable 'omit' from source: magic vars 16142 1727204106.36000: variable 'ansible_distribution_major_version' from source: facts 16142 1727204106.36018: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204106.36028: variable 'omit' from source: magic vars 16142 1727204106.36082: variable 'omit' from source: magic vars 16142 1727204106.36277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204106.39583: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204106.39663: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204106.39708: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204106.39749: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204106.39788: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204106.39893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204106.39927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204106.39963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204106.40018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204106.40043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204106.40157: variable '__network_is_ostree' from source: set_fact 16142 1727204106.40171: variable 'omit' from source: magic vars 16142 1727204106.40210: variable 'omit' from source: magic vars 16142 1727204106.40245: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204106.40279: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204106.40308: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204106.40337: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 16142 1727204106.40451: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204106.40487: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204106.40495: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204106.40503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204106.40613: Set connection var ansible_timeout to 10 16142 1727204106.40621: Set connection var ansible_connection to ssh 16142 1727204106.40633: Set connection var ansible_shell_type to sh 16142 1727204106.40642: Set connection var ansible_shell_executable to /bin/sh 16142 1727204106.40649: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204106.40657: Set connection var ansible_pipelining to False 16142 1727204106.40690: variable 'ansible_shell_executable' from source: unknown 16142 1727204106.40697: variable 'ansible_connection' from source: unknown 16142 1727204106.40704: variable 'ansible_module_compression' from source: unknown 16142 1727204106.40709: variable 'ansible_shell_type' from source: unknown 16142 1727204106.40715: variable 'ansible_shell_executable' from source: unknown 16142 1727204106.40721: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204106.40728: variable 'ansible_pipelining' from source: unknown 16142 1727204106.40737: variable 'ansible_timeout' from source: unknown 16142 1727204106.40744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204106.40852: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204106.40870: variable 'omit' from source: magic vars 16142 1727204106.40884: starting attempt loop 16142 1727204106.40892: running the handler 16142 1727204106.40902: variable 'ansible_facts' from source: unknown 16142 1727204106.40909: variable 'ansible_facts' from source: unknown 16142 1727204106.40950: _low_level_execute_command(): starting 16142 1727204106.40962: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204106.41685: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204106.41701: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204106.41719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204106.41742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204106.41789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204106.41800: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204106.41813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204106.41829: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204106.41845: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204106.41860: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 
1727204106.41876: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204106.41890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204106.41907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204106.41919: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204106.41930: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204106.41947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204106.42030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204106.42063: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204106.42086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204106.42168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 16142 1727204106.44513: stdout chunk (state=3): >>>/root <<< 16142 1727204106.44778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204106.44782: stdout chunk (state=3): >>><<< 16142 1727204106.44785: stderr chunk (state=3): >>><<< 16142 1727204106.44905: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 16142 1727204106.44910: _low_level_execute_command(): starting 16142 1727204106.44913: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204106.4480667-16589-259579564818628 `" && echo ansible-tmp-1727204106.4480667-16589-259579564818628="` echo /root/.ansible/tmp/ansible-tmp-1727204106.4480667-16589-259579564818628 `" ) && sleep 0' 16142 1727204106.47793: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204106.47798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204106.47827: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204106.47831: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204106.47834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204106.47913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204106.47916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204106.47918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204106.47989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 16142 1727204106.50569: stdout chunk (state=3): >>>ansible-tmp-1727204106.4480667-16589-259579564818628=/root/.ansible/tmp/ansible-tmp-1727204106.4480667-16589-259579564818628 <<< 16142 1727204106.50719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204106.50807: stderr chunk (state=3): >>><<< 16142 1727204106.50812: stdout chunk (state=3): >>><<< 16142 1727204106.50972: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204106.4480667-16589-259579564818628=/root/.ansible/tmp/ansible-tmp-1727204106.4480667-16589-259579564818628 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 16142 1727204106.50975: variable 'ansible_module_compression' from source: unknown 16142 1727204106.50979: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 16142 1727204106.50982: ANSIBALLZ: Acquiring lock 16142 1727204106.50984: ANSIBALLZ: Lock acquired: 140089297016096 16142 1727204106.50986: ANSIBALLZ: Creating module 16142 1727204106.80787: ANSIBALLZ: Writing module into payload 16142 1727204106.81074: ANSIBALLZ: Writing module 16142 1727204106.81107: ANSIBALLZ: Renaming module 16142 1727204106.81124: ANSIBALLZ: Done creating module 16142 1727204106.81147: variable 'ansible_facts' from source: unknown 16142 1727204106.81240: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204106.4480667-16589-259579564818628/AnsiballZ_dnf.py 16142 
1727204106.81406: Sending initial data 16142 1727204106.81409: Sent initial data (152 bytes) 16142 1727204106.84837: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204106.84841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204106.84876: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204106.84880: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204106.84883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204106.85069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204106.85072: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204106.85075: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204106.85143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 16142 1727204106.87728: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204106.87768: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204106.87822: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpkyr6hj_p /root/.ansible/tmp/ansible-tmp-1727204106.4480667-16589-259579564818628/AnsiballZ_dnf.py <<< 16142 1727204106.87866: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204106.89509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204106.89768: stderr chunk (state=3): >>><<< 16142 1727204106.89771: stdout chunk (state=3): >>><<< 16142 1727204106.89774: done transferring module to remote 16142 1727204106.89776: _low_level_execute_command(): starting 16142 1727204106.89778: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204106.4480667-16589-259579564818628/ /root/.ansible/tmp/ansible-tmp-1727204106.4480667-16589-259579564818628/AnsiballZ_dnf.py && sleep 0' 16142 1727204106.90323: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204106.90338: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204106.90356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204106.90379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204106.90422: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204106.90435: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204106.90452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204106.90474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204106.90486: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204106.90497: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204106.90509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204106.90523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204106.90539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204106.90552: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204106.90569: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204106.90584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204106.90657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204106.90682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204106.90698: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204106.90777: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 16142 1727204106.93287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204106.93370: stderr chunk (state=3): >>><<< 16142 1727204106.93374: stdout chunk (state=3): >>><<< 16142 1727204106.93477: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 16142 1727204106.93480: _low_level_execute_command(): starting 16142 1727204106.93483: 
_low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204106.4480667-16589-259579564818628/AnsiballZ_dnf.py && sleep 0' 16142 1727204106.94125: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204106.94157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204106.94180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204106.94198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204106.94241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204106.94266: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204106.94282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204106.94300: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204106.94312: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204106.94322: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204106.94334: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204106.94348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204106.94383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204106.94396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204106.94410: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204106.94425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204106.94512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204106.94535: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204106.94553: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204106.94686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 16142 1727204108.07423: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 16142 1727204108.11508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204108.11518: stdout chunk (state=3): >>><<< 16142 1727204108.11521: stderr chunk (state=3): >>><<< 16142 1727204108.11675: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
16142 1727204108.11680: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204106.4480667-16589-259579564818628/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204108.11683: _low_level_execute_command(): starting 16142 1727204108.11685: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204106.4480667-16589-259579564818628/ > /dev/null 2>&1 && sleep 0' 16142 1727204108.12949: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204108.13000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204108.13003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204108.13040: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204108.13043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204108.13045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204108.13048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204108.13138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204108.13144: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204108.13147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204108.13204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204108.15034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204108.15120: stderr chunk (state=3): >>><<< 16142 1727204108.15124: stdout chunk (state=3): >>><<< 16142 1727204108.15372: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204108.15376: handler run complete 16142 1727204108.15379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204108.15521: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204108.15572: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204108.15620: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204108.15654: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204108.15739: variable '__install_status' from source: unknown 16142 1727204108.15763: Evaluated conditional (__install_status is success): True 16142 1727204108.15786: attempt loop complete, returning result 16142 1727204108.15812: _execute() done 16142 1727204108.15822: dumping result to json 16142 1727204108.15832: done dumping result, returning 16142 1727204108.15880: done running TaskExecutor() for managed-node2/TASK: Install dnsmasq [0affcd87-79f5-fddd-f6c7-00000000000f] 16142 1727204108.15891: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000000f ok: [managed-node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 16142 1727204108.16117: no more pending results, returning what we have 16142 1727204108.16121: results queue empty 16142 1727204108.16122: checking for any_errors_fatal 16142 1727204108.16128: done checking for any_errors_fatal 16142 1727204108.16129: checking for max_fail_percentage 16142 1727204108.16131: done checking for max_fail_percentage 16142 1727204108.16132: checking to see if all hosts have failed and the running result is not ok 16142 1727204108.16137: done checking to see if all hosts have failed 16142 1727204108.16138: getting the remaining hosts for this loop 16142 1727204108.16139: done getting the remaining hosts for this loop 16142 1727204108.16143: getting the next task for host managed-node2 16142 1727204108.16150: done getting next task for host managed-node2 16142 1727204108.16153: ^ task is: TASK: Install pgrep, sysctl 16142 1727204108.16156: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204108.16159: getting variables 16142 1727204108.16161: in VariableManager get_vars() 16142 1727204108.16220: Calling all_inventory to load vars for managed-node2 16142 1727204108.16222: Calling groups_inventory to load vars for managed-node2 16142 1727204108.16226: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204108.16236: Calling all_plugins_play to load vars for managed-node2 16142 1727204108.16239: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204108.16243: Calling groups_plugins_play to load vars for managed-node2 16142 1727204108.16458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204108.16707: done with get_vars() 16142 1727204108.16718: done getting variables 16142 1727204108.16874: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000000f 16142 1727204108.16877: WORKER PROCESS EXITING 16142 1727204108.16904: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Tuesday 24 September 2024 14:55:08 -0400 (0:00:01.821) 0:00:07.346 ***** 16142 1727204108.16937: entering _queue_task() for managed-node2/package 16142 1727204108.17426: worker is 1 (out of 1 available) 16142 1727204108.17437: exiting _queue_task() for managed-node2/package 16142 1727204108.17448: done queuing things up, now waiting for results queue to drain 16142 1727204108.17450: waiting for pending results... 
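The "Install dnsmasq" result above comes from the ansible.legacy.dnf invocation recorded earlier in the log (name: dnsmasq, state: present), retried until "__install_status is success". The task source itself is not reproduced in this log, but a minimal sketch consistent with those logged values would be:

    - name: Install dnsmasq
      package:                           # resolves to ansible.legacy.dnf on this host, as logged
        name: dnsmasq
        state: present
      register: __install_status         # the log evaluates "__install_status is success"
      until: __install_status is success
      retries: 3                         # hypothetical; the retry count is not visible in the log
      delay: 2                           # hypothetical; only "attempts": 1 appears in the result

Because dnsmasq was already present on managed-node2, dnf reports rc=0 with "Nothing to do" and changed: false after a single attempt.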
16142 1727204108.17694: running TaskExecutor() for managed-node2/TASK: Install pgrep, sysctl 16142 1727204108.17824: in run() - task 0affcd87-79f5-fddd-f6c7-000000000010 16142 1727204108.17847: variable 'ansible_search_path' from source: unknown 16142 1727204108.17856: variable 'ansible_search_path' from source: unknown 16142 1727204108.17902: calling self._execute() 16142 1727204108.17990: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204108.18002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204108.18017: variable 'omit' from source: magic vars 16142 1727204108.18548: variable 'ansible_distribution_major_version' from source: facts 16142 1727204108.18571: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204108.18681: variable 'ansible_os_family' from source: facts 16142 1727204108.18695: Evaluated conditional (ansible_os_family == 'RedHat'): True 16142 1727204108.18851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204108.19961: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204108.20017: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204108.20175: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204108.20215: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204108.20420: variable 'ansible_distribution_major_version' from source: facts 16142 1727204108.20440: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 16142 1727204108.20447: when evaluation is False, skipping this task 16142 1727204108.20455: _execute() done 16142 1727204108.20462: dumping result to json 16142 1727204108.20471: done dumping result, returning 16142 1727204108.20486: done running TaskExecutor() for managed-node2/TASK: Install pgrep, sysctl [0affcd87-79f5-fddd-f6c7-000000000010] 16142 1727204108.20497: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000010 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 16142 1727204108.20659: no more pending results, returning what we have 16142 1727204108.20663: results queue empty 16142 1727204108.20666: checking for any_errors_fatal 16142 1727204108.20674: done checking for any_errors_fatal 16142 1727204108.20675: checking for max_fail_percentage 16142 1727204108.20677: done checking for max_fail_percentage 16142 1727204108.20677: checking to see if all hosts have failed and the running result is not ok 16142 1727204108.20678: done checking to see if all hosts have failed 16142 1727204108.20679: getting the remaining hosts for this loop 16142 1727204108.20680: done getting the remaining hosts for this loop 16142 1727204108.20686: getting the next task for host managed-node2 16142 1727204108.20693: done getting next task for host managed-node2 16142 1727204108.20696: ^ task is: TASK: Install pgrep, sysctl 16142 1727204108.20698: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204108.20703: getting variables 16142 1727204108.20705: in VariableManager get_vars() 16142 1727204108.20766: Calling all_inventory to load vars for managed-node2 16142 1727204108.20769: Calling groups_inventory to load vars for managed-node2 16142 1727204108.20772: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204108.20787: Calling all_plugins_play to load vars for managed-node2 16142 1727204108.20791: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204108.20799: Calling groups_plugins_play to load vars for managed-node2 16142 1727204108.21026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204108.21236: done with get_vars() 16142 1727204108.21249: done getting variables 16142 1727204108.21317: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Tuesday 24 September 2024 14:55:08 -0400 (0:00:00.044) 0:00:07.390 ***** 16142 1727204108.21356: entering _queue_task() for managed-node2/package 16142 1727204108.22035: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000010 16142 1727204108.22039: WORKER PROCESS EXITING 16142 1727204108.22278: worker is 1 (out of 1 available) 16142 1727204108.22373: exiting _queue_task() for managed-node2/package 16142 1727204108.22387: done queuing things up, now waiting for results queue to drain 16142 1727204108.22389: waiting for pending results... 
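The first "Install pgrep, sysctl" task (create_test_interfaces_with_dhcp.yml:17) was skipped because its when-condition, ansible_distribution_major_version is version('6', '<='), evaluated to False on this host. A skipped task never renders its arguments, so the log does not show which packages that variant would install; the sketch below therefore uses an illustrative package name:

    - name: Install pgrep, sysctl
      package:
        name: procps        # hypothetical EL6 package providing pgrep/sysctl; not shown in the log
        state: present
      when: ansible_distribution_major_version is version('6', '<=')

A false when-condition yields exactly the result printed above: changed: false, with the failing expression echoed back as false_condition and skip_reason "Conditional result was False".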
16142 1727204108.22655: running TaskExecutor() for managed-node2/TASK: Install pgrep, sysctl 16142 1727204108.22759: in run() - task 0affcd87-79f5-fddd-f6c7-000000000011 16142 1727204108.22777: variable 'ansible_search_path' from source: unknown 16142 1727204108.22781: variable 'ansible_search_path' from source: unknown 16142 1727204108.22816: calling self._execute() 16142 1727204108.22909: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204108.22913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204108.22924: variable 'omit' from source: magic vars 16142 1727204108.23543: variable 'ansible_distribution_major_version' from source: facts 16142 1727204108.23560: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204108.23686: variable 'ansible_os_family' from source: facts 16142 1727204108.23698: Evaluated conditional (ansible_os_family == 'RedHat'): True 16142 1727204108.23918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204108.24249: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204108.24389: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204108.24491: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204108.24528: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204108.24605: variable 'ansible_distribution_major_version' from source: facts 16142 1727204108.24621: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 16142 1727204108.24634: variable 'omit' from source: magic vars 16142 1727204108.24686: variable 'omit' from source: magic vars 16142 1727204108.24844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204108.28515: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204108.28620: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204108.28684: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204108.28803: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204108.29103: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204108.29205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204108.29245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204108.29280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204108.29336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204108.29359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204108.29537: variable '__network_is_ostree' from source: set_fact 16142 1727204108.29739: variable 'omit' from source: magic vars 16142 1727204108.29776: variable 'omit' from source: magic vars 16142 1727204108.29807: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204108.29843: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204108.29870: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204108.29892: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204108.29958: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204108.29995: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204108.30100: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204108.30109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204108.30217: Set connection var ansible_timeout to 10 16142 1727204108.30304: Set connection var ansible_connection to ssh 16142 1727204108.30339: Set connection var ansible_shell_type to sh 16142 1727204108.30362: Set connection var ansible_shell_executable to /bin/sh 16142 1727204108.30376: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204108.30390: Set connection var ansible_pipelining to False 16142 1727204108.30416: variable 'ansible_shell_executable' from source: unknown 16142 1727204108.30425: variable 'ansible_connection' from source: unknown 16142 1727204108.30435: variable 'ansible_module_compression' from source: unknown 16142 1727204108.30443: variable 'ansible_shell_type' from source: unknown 16142 1727204108.30450: variable 'ansible_shell_executable' from source: unknown 16142 1727204108.30456: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204108.30466: variable 'ansible_pipelining' from source: unknown 16142 1727204108.30482: variable 'ansible_timeout' from source: unknown 16142 1727204108.30493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204108.30598: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204108.30615: variable 'omit' from source: magic vars 16142 1727204108.30626: starting attempt loop 16142 1727204108.30637: running the handler 16142 1727204108.30648: variable 'ansible_facts' from source: unknown 16142 1727204108.30659: variable 'ansible_facts' from source: unknown 16142 1727204108.30693: _low_level_execute_command(): starting 16142 1727204108.30699: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204108.31681: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 16142 1727204108.31697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204108.31711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204108.31729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204108.31776: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204108.31789: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204108.31803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204108.31820: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204108.31840: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204108.31853: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204108.31868: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204108.31890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204108.31908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204108.31922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204108.31933: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204108.31947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204108.32050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204108.32081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204108.32162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204108.32240: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204108.33883: stdout chunk (state=3): >>>/root <<< 16142 1727204108.33989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204108.34050: stderr chunk (state=3): >>><<< 16142 1727204108.34052: stdout chunk (state=3): >>><<< 16142 1727204108.34070: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 16142 1727204108.34081: _low_level_execute_command(): starting 16142 1727204108.34087: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204108.3407135-16764-164653606726212 `" && echo ansible-tmp-1727204108.3407135-16764-164653606726212="` echo /root/.ansible/tmp/ansible-tmp-1727204108.3407135-16764-164653606726212 `" ) && sleep 0' 16142 1727204108.34524: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204108.34528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204108.34560: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204108.34565: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204108.34568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204108.34624: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204108.34630: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204108.34634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204108.34675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204108.36540: stdout chunk (state=3): >>>ansible-tmp-1727204108.3407135-16764-164653606726212=/root/.ansible/tmp/ansible-tmp-1727204108.3407135-16764-164653606726212 <<< 16142 1727204108.36649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204108.36720: stderr chunk (state=3): >>><<< 16142 1727204108.36723: stdout chunk (state=3): >>><<< 16142 1727204108.36773: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204108.3407135-16764-164653606726212=/root/.ansible/tmp/ansible-tmp-1727204108.3407135-16764-164653606726212 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204108.36873: variable 'ansible_module_compression' from source: unknown 16142 1727204108.36982: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 16142 1727204108.36985: variable 'ansible_facts' from source: unknown 16142 1727204108.37024: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204108.3407135-16764-164653606726212/AnsiballZ_dnf.py 16142 1727204108.37173: Sending initial data 16142 1727204108.37182: Sent initial data (152 bytes) 16142 1727204108.37855: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204108.37859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204108.37897: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204108.37900: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204108.37903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204108.37962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204108.37970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204108.37973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204108.38006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204108.39727: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 16142 1727204108.39745: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204108.39772: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204108.39845: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpgwgw0nhx /root/.ansible/tmp/ansible-tmp-1727204108.3407135-16764-164653606726212/AnsiballZ_dnf.py <<< 16142 1727204108.39874: stderr chunk (state=3): >>>debug1: Couldn't stat 
remote file: No such file or directory <<< 16142 1727204108.41252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204108.41374: stderr chunk (state=3): >>><<< 16142 1727204108.41378: stdout chunk (state=3): >>><<< 16142 1727204108.41394: done transferring module to remote 16142 1727204108.41404: _low_level_execute_command(): starting 16142 1727204108.41410: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204108.3407135-16764-164653606726212/ /root/.ansible/tmp/ansible-tmp-1727204108.3407135-16764-164653606726212/AnsiballZ_dnf.py && sleep 0' 16142 1727204108.41859: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204108.41867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204108.41918: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204108.41922: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204108.41925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204108.41927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204108.41980: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204108.41993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204108.42040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204108.43800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204108.43879: stderr chunk (state=3): >>><<< 16142 1727204108.43888: stdout chunk (state=3): >>><<< 16142 1727204108.43978: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204108.43981: _low_level_execute_command(): starting 16142 1727204108.43983: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204108.3407135-16764-164653606726212/AnsiballZ_dnf.py && sleep 0' 16142 1727204108.44502: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204108.44516: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204108.44528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204108.44546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204108.44590: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204108.44600: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204108.44611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204108.44627: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204108.44640: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204108.44648: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204108.44657: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204108.44670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204108.44683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204108.44692: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204108.44700: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204108.44710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204108.44784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204108.44803: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204108.44817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204108.44893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204109.37801: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 16142 1727204109.41973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204109.41978: stdout chunk (state=3): >>><<< 16142 1727204109.41981: stderr chunk (state=3): >>><<< 16142 1727204109.42144: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
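The JSON emitted by AnsiballZ_dnf.py shows that the second "Install pgrep, sysctl" task (create_test_interfaces_with_dhcp.yml:26) resolved to the dnf module with name: ["procps-ng"] and state: present, guarded by the ansible_distribution_major_version is version('7', '>=') check evaluated earlier. A sketch consistent with those logged values (not the literal task source) is:

    - name: Install pgrep, sysctl
      package:                  # resolves to ansible.legacy.dnf here, as logged
        name: procps-ng         # provides pgrep and sysctl on EL7 and later
        state: present
      when: ansible_distribution_major_version is version('7', '>=')

As with dnsmasq, the package is already installed, so the module returns rc=0, "Nothing to do", changed: false.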
16142 1727204109.42148: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204108.3407135-16764-164653606726212/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204109.42152: _low_level_execute_command(): starting 16142 1727204109.42154: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204108.3407135-16764-164653606726212/ > /dev/null 2>&1 && sleep 0' 16142 1727204109.42656: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204109.42660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204109.42699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204109.42703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204109.42706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204109.42772: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204109.42775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204109.42778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204109.42845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204109.44649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204109.44835: stderr chunk (state=3): >>><<< 16142 1727204109.44839: stdout chunk (state=3): >>><<< 16142 1727204109.45074: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204109.45079: handler run complete 16142 1727204109.45081: attempt loop complete, returning result 16142 1727204109.45084: _execute() done 16142 1727204109.45086: dumping result to json 16142 1727204109.45088: done dumping result, returning 16142 1727204109.45090: done running TaskExecutor() for managed-node2/TASK: Install pgrep, sysctl [0affcd87-79f5-fddd-f6c7-000000000011] 16142 1727204109.45092: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000011 16142 1727204109.45177: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000011 16142 1727204109.45181: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 16142 1727204109.45265: no more pending results, returning what we have 16142 1727204109.45269: results queue empty 16142 1727204109.45270: checking for any_errors_fatal 16142 1727204109.45277: done checking for any_errors_fatal 16142 1727204109.45278: checking for max_fail_percentage 16142 1727204109.45280: done checking for max_fail_percentage 16142 1727204109.45281: checking to see if all hosts have failed and the running result is not ok 16142 1727204109.45282: done checking to see if all hosts have failed 16142 1727204109.45282: getting the remaining hosts for this loop 16142 1727204109.45284: done getting the remaining hosts for this loop 16142 1727204109.45288: getting the next task for host managed-node2 16142 1727204109.45296: done getting next task for host managed-node2 16142 1727204109.45299: ^ task is: TASK: Create test interfaces 16142 1727204109.45301: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204109.45308: getting variables 16142 1727204109.45310: in VariableManager get_vars() 16142 1727204109.45369: Calling all_inventory to load vars for managed-node2 16142 1727204109.45373: Calling groups_inventory to load vars for managed-node2 16142 1727204109.45375: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204109.45386: Calling all_plugins_play to load vars for managed-node2 16142 1727204109.45389: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204109.45391: Calling groups_plugins_play to load vars for managed-node2 16142 1727204109.45829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204109.46052: done with get_vars() 16142 1727204109.46066: done getting variables 16142 1727204109.46168: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Tuesday 24 September 2024 14:55:09 -0400 (0:00:01.248) 0:00:08.638 ***** 16142 1727204109.46199: entering _queue_task() for managed-node2/shell 16142 1727204109.46201: Creating lock for shell 16142 1727204109.46490: worker is 1 (out of 1 available) 16142 1727204109.46502: exiting _queue_task() for managed-node2/shell 16142 1727204109.46513: done queuing things up, now waiting for results queue to drain 16142 1727204109.46515: waiting for pending results... 
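The "Create test interfaces" task (create_test_interfaces_with_dhcp.yml:35) goes through the shell action plugin; its fully rendered command appears in the module result further down: veth pairs test1/test1p and test2/test2p, a testbr bridge carrying 192.0.2.1/24 and 2001:DB8::1/32, and a dnsmasq instance serving DHCP on both ranges. The interface names are presumably templated from the dhcp_interface1/dhcp_interface2 play vars the log resolves for this task (that templating is an assumption; the rendered values are test1 and test2). A condensed sketch of that shell task, omitting the NetworkManager handling and the address-retry loop that the full cmd field contains:

    - name: Create test interfaces
      shell: |
        set -euxo pipefail
        exec 1>&2
        ip link add {{ dhcp_interface1 }} type veth peer name {{ dhcp_interface1 }}p
        ip link add {{ dhcp_interface2 }} type veth peer name {{ dhcp_interface2 }}p
        ip link add name testbr type bridge forward_delay 0
        ip link set testbr up
        ip addr add 192.0.2.1/24 dev testbr
        ip -6 addr add 2001:DB8::1/32 dev testbr
        dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease \
                --dhcp-range=192.0.2.1,192.0.2.254,240 \
                --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 \
                --enable-ra --interface=testbr --bind-interfaces

Unlike the dnf calls earlier, which reused a cached AnsiballZ payload, the command module backing this shell task is built fresh here ("ANSIBALLZ: Creating module" in the log below) before being transferred and executed.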
16142 1727204109.46785: running TaskExecutor() for managed-node2/TASK: Create test interfaces 16142 1727204109.46911: in run() - task 0affcd87-79f5-fddd-f6c7-000000000012 16142 1727204109.46930: variable 'ansible_search_path' from source: unknown 16142 1727204109.46940: variable 'ansible_search_path' from source: unknown 16142 1727204109.46989: calling self._execute() 16142 1727204109.47082: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204109.47097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204109.47111: variable 'omit' from source: magic vars 16142 1727204109.47510: variable 'ansible_distribution_major_version' from source: facts 16142 1727204109.47538: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204109.47551: variable 'omit' from source: magic vars 16142 1727204109.47618: variable 'omit' from source: magic vars 16142 1727204109.48049: variable 'dhcp_interface1' from source: play vars 16142 1727204109.48062: variable 'dhcp_interface2' from source: play vars 16142 1727204109.48103: variable 'omit' from source: magic vars 16142 1727204109.48151: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204109.48198: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204109.48223: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204109.48290: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204109.48386: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204109.48425: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204109.48436: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204109.48444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204109.48587: Set connection var ansible_timeout to 10 16142 1727204109.48678: Set connection var ansible_connection to ssh 16142 1727204109.48712: Set connection var ansible_shell_type to sh 16142 1727204109.48729: Set connection var ansible_shell_executable to /bin/sh 16142 1727204109.48820: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204109.48840: Set connection var ansible_pipelining to False 16142 1727204109.48869: variable 'ansible_shell_executable' from source: unknown 16142 1727204109.48877: variable 'ansible_connection' from source: unknown 16142 1727204109.48883: variable 'ansible_module_compression' from source: unknown 16142 1727204109.48889: variable 'ansible_shell_type' from source: unknown 16142 1727204109.48895: variable 'ansible_shell_executable' from source: unknown 16142 1727204109.48901: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204109.48908: variable 'ansible_pipelining' from source: unknown 16142 1727204109.48923: variable 'ansible_timeout' from source: unknown 16142 1727204109.48945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204109.49207: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204109.49279: variable 'omit' from source: magic vars 16142 1727204109.49365: starting attempt loop 16142 1727204109.49376: running the handler 16142 1727204109.49389: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204109.49412: _low_level_execute_command(): starting 16142 1727204109.49423: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204109.50624: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204109.50642: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204109.50656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204109.50683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204109.50729: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204109.50747: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204109.50762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204109.50786: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204109.50800: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204109.50817: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204109.50829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204109.50847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204109.50863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204109.50880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204109.50891: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204109.50903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204109.50986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204109.51003: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204109.51017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204109.51085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204109.52657: stdout chunk (state=3): >>>/root <<< 16142 1727204109.52824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204109.52828: stdout chunk (state=3): >>><<< 16142 1727204109.52838: stderr chunk (state=3): >>><<< 16142 1727204109.52859: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204109.52878: _low_level_execute_command(): starting 16142 1727204109.52884: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204109.5286114-17094-4107409490138 `" && echo ansible-tmp-1727204109.5286114-17094-4107409490138="` echo /root/.ansible/tmp/ansible-tmp-1727204109.5286114-17094-4107409490138 `" ) && sleep 0' 16142 1727204109.53505: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204109.53513: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204109.53524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204109.53540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204109.53580: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204109.53586: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204109.53596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204109.53611: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204109.53616: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204109.53623: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204109.53630: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204109.53641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204109.53650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204109.53657: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204109.53665: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204109.53678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204109.53757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204109.53765: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204109.53773: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204109.53847: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 16142 1727204109.55684: stdout chunk (state=3): >>>ansible-tmp-1727204109.5286114-17094-4107409490138=/root/.ansible/tmp/ansible-tmp-1727204109.5286114-17094-4107409490138 <<< 16142 1727204109.55796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204109.55887: stderr chunk (state=3): >>><<< 16142 1727204109.55899: stdout chunk (state=3): >>><<< 16142 1727204109.55970: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204109.5286114-17094-4107409490138=/root/.ansible/tmp/ansible-tmp-1727204109.5286114-17094-4107409490138 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204109.56273: variable 'ansible_module_compression' from source: unknown 16142 1727204109.56276: ANSIBALLZ: Using generic lock for ansible.legacy.command 16142 1727204109.56279: ANSIBALLZ: Acquiring lock 16142 1727204109.56281: ANSIBALLZ: Lock acquired: 140089297016096 16142 1727204109.56283: ANSIBALLZ: Creating module 16142 1727204109.78261: ANSIBALLZ: Writing module into payload 16142 1727204109.78392: ANSIBALLZ: Writing module 16142 1727204109.78434: ANSIBALLZ: Renaming module 16142 1727204109.78452: ANSIBALLZ: Done creating module 16142 1727204109.78479: variable 'ansible_facts' from source: unknown 16142 1727204109.78581: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204109.5286114-17094-4107409490138/AnsiballZ_command.py 16142 1727204109.78765: Sending initial data 16142 1727204109.78769: Sent initial data (154 bytes) 16142 1727204109.79861: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204109.79884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204109.79901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204109.79921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204109.79973: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204109.79990: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204109.80005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204109.80024: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204109.80038: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204109.80053: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204109.80069: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204109.80086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204109.80107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204109.80120: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204109.80135: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204109.80150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204109.80237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204109.80260: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204109.80283: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204109.80366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204109.82221: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204109.82257: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204109.82300: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpcilonu8x /root/.ansible/tmp/ansible-tmp-1727204109.5286114-17094-4107409490138/AnsiballZ_command.py <<< 16142 1727204109.82334: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204109.83941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204109.84062: stderr chunk (state=3): >>><<< 16142 1727204109.84068: stdout chunk (state=3): >>><<< 16142 1727204109.84070: done transferring module to remote 16142 1727204109.84072: _low_level_execute_command(): starting 16142 1727204109.84075: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204109.5286114-17094-4107409490138/ /root/.ansible/tmp/ansible-tmp-1727204109.5286114-17094-4107409490138/AnsiballZ_command.py && sleep 0' 16142 1727204109.85496: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204109.85616: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204109.85620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204109.85666: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204109.85669: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204109.85672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204109.85674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204109.85861: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204109.85866: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204109.85885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204109.86042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204109.87782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204109.87837: stderr chunk (state=3): >>><<< 16142 1727204109.87840: stdout chunk (state=3): >>><<< 16142 1727204109.87860: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204109.87863: _low_level_execute_command(): starting 16142 1727204109.87869: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204109.5286114-17094-4107409490138/AnsiballZ_command.py && sleep 0' 16142 1727204109.89593: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204109.89602: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204109.89613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204109.89627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204109.89674: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204109.89790: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204109.89800: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204109.89813: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204109.89820: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204109.89826: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204109.89837: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204109.89846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204109.89857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204109.89866: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204109.89874: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204109.89883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204109.89960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204109.90012: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204109.90022: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204109.90188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204111.24501: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 6823 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 6823 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in 
https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:55:10.034197", "end": "2024-09-24 14:55:11.243747", "delta": "0:00:01.209550", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 16142 1727204111.25825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204111.25880: stderr chunk (state=3): >>><<< 16142 1727204111.25884: stdout chunk (state=3): >>><<< 16142 1727204111.25908: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 6823 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 6823 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:55:10.034197", "end": "2024-09-24 14:55:11.243747", "delta": "0:00:01.209550", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
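For readers following the log: the result above is produced by a single invocation of the shell/command module with the multi-line setup script passed as _raw_params. A minimal, hypothetical sketch of the task shape that would yield such an invocation is given below; the real tests source file is not part of this log, and the script body is abbreviated rather than repeated. The changed_when setting is inferred from the "changed": false in the recorded result, since the command module itself reported changed: true.

- name: Create test interfaces
  ansible.builtin.shell: |
    set -euxo pipefail
    exec 1>&2
    ip link add test1 type veth peer name test1p
    ip link add test2 type veth peer name test2p
    # ... remainder of the bridge/dnsmasq setup script, exactly as recorded in
    # the module_args above ...
  changed_when: false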
16142 1727204111.25951: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204109.5286114-17094-4107409490138/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204111.25958: _low_level_execute_command(): starting 16142 1727204111.25963: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204109.5286114-17094-4107409490138/ > /dev/null 2>&1 && sleep 0' 16142 1727204111.26560: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204111.26577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204111.26657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204111.28583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204111.28587: stdout chunk (state=3): >>><<< 16142 1727204111.28590: stderr chunk (state=3): >>><<< 16142 1727204111.29062: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204111.29068: handler run complete 16142 1727204111.29070: Evaluated conditional (False): False 16142 1727204111.29072: attempt loop complete, returning result 16142 1727204111.29074: _execute() done 16142 1727204111.29076: dumping result to json 16142 1727204111.29078: done dumping result, returning 16142 1727204111.29080: done running TaskExecutor() for managed-node2/TASK: Create test interfaces [0affcd87-79f5-fddd-f6c7-000000000012] 16142 1727204111.29082: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000012 16142 1727204111.29158: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000012 16142 1727204111.29161: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.209550", "end": "2024-09-24 14:55:11.243747", "rc": 0, "start": "2024-09-24 14:55:10.034197" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 6823 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 6823 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 16142 1727204111.29249: no more pending results, returning what we have 16142 1727204111.29253: results queue empty 16142 1727204111.29254: checking for any_errors_fatal 16142 1727204111.29260: done checking for any_errors_fatal 16142 1727204111.29261: checking for max_fail_percentage 16142 1727204111.29262: done checking for max_fail_percentage 16142 1727204111.29267: checking to see if all hosts have failed and the running result is not ok 16142 1727204111.29268: done checking to see if all hosts have failed 16142 1727204111.29269: getting the remaining hosts for this loop 16142 1727204111.29270: done getting the remaining hosts for this loop 16142 1727204111.29273: getting the next task for host managed-node2 16142 1727204111.29281: done getting next task for host managed-node2 16142 1727204111.29284: ^ task is: TASK: 
Include the task 'get_interface_stat.yml' 16142 1727204111.29286: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204111.29290: getting variables 16142 1727204111.29291: in VariableManager get_vars() 16142 1727204111.29341: Calling all_inventory to load vars for managed-node2 16142 1727204111.29344: Calling groups_inventory to load vars for managed-node2 16142 1727204111.29347: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204111.29357: Calling all_plugins_play to load vars for managed-node2 16142 1727204111.29359: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204111.29362: Calling groups_plugins_play to load vars for managed-node2 16142 1727204111.29541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204111.29755: done with get_vars() 16142 1727204111.29770: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:55:11 -0400 (0:00:01.836) 0:00:10.475 ***** 16142 1727204111.29870: entering _queue_task() for managed-node2/include_tasks 16142 1727204111.30162: worker is 1 (out of 1 available) 16142 1727204111.30176: exiting _queue_task() for managed-node2/include_tasks 16142 1727204111.30187: done queuing things up, now waiting for results queue to drain 16142 1727204111.30188: waiting for pending results... 
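For orientation, the include queued above (task path assert_device_present.yml:3) can be pictured roughly as the task below. This is an illustrative reconstruction, not a copy of the repository file; the conditional mirrors the evaluation recorded in the log and may in practice be inherited from an enclosing block rather than set on the task itself.

- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml
  when: ansible_distribution_major_version != '6'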
16142 1727204111.30468: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 16142 1727204111.30604: in run() - task 0affcd87-79f5-fddd-f6c7-000000000016 16142 1727204111.30626: variable 'ansible_search_path' from source: unknown 16142 1727204111.30640: variable 'ansible_search_path' from source: unknown 16142 1727204111.30686: calling self._execute() 16142 1727204111.30804: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204111.30826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204111.30860: variable 'omit' from source: magic vars 16142 1727204111.31288: variable 'ansible_distribution_major_version' from source: facts 16142 1727204111.31307: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204111.31319: _execute() done 16142 1727204111.31328: dumping result to json 16142 1727204111.31340: done dumping result, returning 16142 1727204111.31351: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-fddd-f6c7-000000000016] 16142 1727204111.31362: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000016 16142 1727204111.31519: no more pending results, returning what we have 16142 1727204111.31525: in VariableManager get_vars() 16142 1727204111.31595: Calling all_inventory to load vars for managed-node2 16142 1727204111.31599: Calling groups_inventory to load vars for managed-node2 16142 1727204111.31602: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204111.31616: Calling all_plugins_play to load vars for managed-node2 16142 1727204111.31620: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204111.31623: Calling groups_plugins_play to load vars for managed-node2 16142 1727204111.32166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204111.32617: done with get_vars() 16142 1727204111.32625: variable 'ansible_search_path' from source: unknown 16142 1727204111.32626: variable 'ansible_search_path' from source: unknown 16142 1727204111.32801: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000016 16142 1727204111.32805: WORKER PROCESS EXITING 16142 1727204111.32839: we have included files to process 16142 1727204111.32840: generating all_blocks data 16142 1727204111.32842: done generating all_blocks data 16142 1727204111.32843: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 16142 1727204111.32844: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 16142 1727204111.32846: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 16142 1727204111.33092: done processing included file 16142 1727204111.33094: iterating over new_blocks loaded from include file 16142 1727204111.33096: in VariableManager get_vars() 16142 1727204111.33120: done with get_vars() 16142 1727204111.33122: filtering new block on tags 16142 1727204111.33139: done filtering new block on tags 16142 1727204111.33141: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 16142 
1727204111.33145: extending task lists for all hosts with included blocks 16142 1727204111.33242: done extending task lists 16142 1727204111.33243: done processing included files 16142 1727204111.33244: results queue empty 16142 1727204111.33245: checking for any_errors_fatal 16142 1727204111.33250: done checking for any_errors_fatal 16142 1727204111.33251: checking for max_fail_percentage 16142 1727204111.33252: done checking for max_fail_percentage 16142 1727204111.33253: checking to see if all hosts have failed and the running result is not ok 16142 1727204111.33253: done checking to see if all hosts have failed 16142 1727204111.33254: getting the remaining hosts for this loop 16142 1727204111.33255: done getting the remaining hosts for this loop 16142 1727204111.33257: getting the next task for host managed-node2 16142 1727204111.33261: done getting next task for host managed-node2 16142 1727204111.33267: ^ task is: TASK: Get stat for interface {{ interface }} 16142 1727204111.33269: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204111.33272: getting variables 16142 1727204111.33273: in VariableManager get_vars() 16142 1727204111.33291: Calling all_inventory to load vars for managed-node2 16142 1727204111.33293: Calling groups_inventory to load vars for managed-node2 16142 1727204111.33295: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204111.33301: Calling all_plugins_play to load vars for managed-node2 16142 1727204111.33303: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204111.33306: Calling groups_plugins_play to load vars for managed-node2 16142 1727204111.33428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204111.33626: done with get_vars() 16142 1727204111.33637: done getting variables 16142 1727204111.33797: variable 'interface' from source: task vars 16142 1727204111.33802: variable 'dhcp_interface1' from source: play vars 16142 1727204111.33871: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:55:11 -0400 (0:00:00.040) 0:00:10.515 ***** 16142 1727204111.33912: entering _queue_task() for managed-node2/stat 16142 1727204111.34197: worker is 1 (out of 1 available) 16142 1727204111.34207: exiting _queue_task() for managed-node2/stat 16142 1727204111.34218: done queuing things up, now waiting for results queue to drain 16142 1727204111.34220: waiting for pending results... 
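The stat task queued here comes from the included get_interface_stat.yml, with interface templated from dhcp_interface1 (test1). A plausible minimal sketch of that included file follows, assuming the conventional check against the device's sysfs entry; the actual file contents are not reproduced in this log.

- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: /sys/class/net/{{ interface }}   # assumed path; a present device exposes this directory
  register: interface_stat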
16142 1727204111.34478: running TaskExecutor() for managed-node2/TASK: Get stat for interface test1 16142 1727204111.34619: in run() - task 0affcd87-79f5-fddd-f6c7-000000000248 16142 1727204111.34638: variable 'ansible_search_path' from source: unknown 16142 1727204111.34645: variable 'ansible_search_path' from source: unknown 16142 1727204111.34689: calling self._execute() 16142 1727204111.34778: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204111.34787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204111.34800: variable 'omit' from source: magic vars 16142 1727204111.35169: variable 'ansible_distribution_major_version' from source: facts 16142 1727204111.35187: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204111.35197: variable 'omit' from source: magic vars 16142 1727204111.35267: variable 'omit' from source: magic vars 16142 1727204111.35371: variable 'interface' from source: task vars 16142 1727204111.35380: variable 'dhcp_interface1' from source: play vars 16142 1727204111.35453: variable 'dhcp_interface1' from source: play vars 16142 1727204111.35477: variable 'omit' from source: magic vars 16142 1727204111.35521: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204111.35568: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204111.35673: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204111.35697: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204111.35715: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204111.35753: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204111.35761: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204111.35771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204111.35881: Set connection var ansible_timeout to 10 16142 1727204111.35889: Set connection var ansible_connection to ssh 16142 1727204111.35897: Set connection var ansible_shell_type to sh 16142 1727204111.35906: Set connection var ansible_shell_executable to /bin/sh 16142 1727204111.35914: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204111.35926: Set connection var ansible_pipelining to False 16142 1727204111.35956: variable 'ansible_shell_executable' from source: unknown 16142 1727204111.35971: variable 'ansible_connection' from source: unknown 16142 1727204111.35980: variable 'ansible_module_compression' from source: unknown 16142 1727204111.35987: variable 'ansible_shell_type' from source: unknown 16142 1727204111.35993: variable 'ansible_shell_executable' from source: unknown 16142 1727204111.35999: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204111.36006: variable 'ansible_pipelining' from source: unknown 16142 1727204111.36014: variable 'ansible_timeout' from source: unknown 16142 1727204111.36021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204111.36228: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204111.36245: variable 'omit' from source: magic vars 16142 1727204111.36254: starting attempt loop 16142 1727204111.36259: running the handler 16142 1727204111.36276: _low_level_execute_command(): starting 16142 1727204111.36290: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204111.37159: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204111.37182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204111.37198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204111.37219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204111.37274: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204111.37288: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204111.37303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204111.37322: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204111.37339: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204111.37351: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204111.37363: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204111.37383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204111.37401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204111.37414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204111.37425: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204111.37443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204111.37541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204111.37568: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204111.37586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204111.37677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204111.39278: stdout chunk (state=3): >>>/root <<< 16142 1727204111.39382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204111.39481: stderr chunk (state=3): >>><<< 16142 1727204111.39498: stdout chunk (state=3): >>><<< 16142 1727204111.39626: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204111.39630: _low_level_execute_command(): starting 16142 1727204111.39636: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204111.395331-17157-105790363209994 `" && echo ansible-tmp-1727204111.395331-17157-105790363209994="` echo /root/.ansible/tmp/ansible-tmp-1727204111.395331-17157-105790363209994 `" ) && sleep 0' 16142 1727204111.40316: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204111.40331: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204111.40358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204111.40384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204111.40449: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204111.40472: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204111.40490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204111.40518: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204111.40530: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204111.40553: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204111.40575: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204111.40590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204111.40621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204111.40642: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204111.40662: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204111.40679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204111.40777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204111.40812: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204111.40851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204111.40953: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204111.42824: stdout chunk (state=3): >>>ansible-tmp-1727204111.395331-17157-105790363209994=/root/.ansible/tmp/ansible-tmp-1727204111.395331-17157-105790363209994 <<< 16142 1727204111.42942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204111.42999: stderr chunk (state=3): >>><<< 16142 
1727204111.43002: stdout chunk (state=3): >>><<< 16142 1727204111.43018: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204111.395331-17157-105790363209994=/root/.ansible/tmp/ansible-tmp-1727204111.395331-17157-105790363209994 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204111.43060: variable 'ansible_module_compression' from source: unknown 16142 1727204111.43109: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 16142 1727204111.43141: variable 'ansible_facts' from source: unknown 16142 1727204111.43206: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204111.395331-17157-105790363209994/AnsiballZ_stat.py 16142 1727204111.43314: Sending initial data 16142 1727204111.43318: Sent initial data (152 bytes) 16142 1727204111.44267: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204111.44271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204111.44317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204111.44322: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 16142 1727204111.44338: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204111.44341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204111.44357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204111.44361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204111.44448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204111.44467: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204111.44475: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 16142 1727204111.44537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204111.46251: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204111.46285: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204111.46320: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpnrrw8ppu /root/.ansible/tmp/ansible-tmp-1727204111.395331-17157-105790363209994/AnsiballZ_stat.py <<< 16142 1727204111.46354: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204111.47243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204111.47427: stderr chunk (state=3): >>><<< 16142 1727204111.47430: stdout chunk (state=3): >>><<< 16142 1727204111.47451: done transferring module to remote 16142 1727204111.47462: _low_level_execute_command(): starting 16142 1727204111.47469: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204111.395331-17157-105790363209994/ /root/.ansible/tmp/ansible-tmp-1727204111.395331-17157-105790363209994/AnsiballZ_stat.py && sleep 0' 16142 1727204111.48140: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204111.48149: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204111.48159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204111.48176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204111.48221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204111.48228: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204111.48238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204111.48251: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204111.48258: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204111.48266: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204111.48275: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204111.48284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204111.48295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204111.48301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204111.48317: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204111.48327: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204111.48401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204111.48421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204111.48445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204111.48517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204111.50220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204111.50279: stderr chunk (state=3): >>><<< 16142 1727204111.50282: stdout chunk (state=3): >>><<< 16142 1727204111.50296: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204111.50299: _low_level_execute_command(): starting 16142 1727204111.50304: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204111.395331-17157-105790363209994/AnsiballZ_stat.py && sleep 0' 16142 1727204111.51349: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204111.51386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204111.51389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204111.51391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204111.51393: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204111.51395: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204111.51397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204111.51399: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204111.51400: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204111.51402: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204111.51404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204111.51406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204111.51407: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204111.51409: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204111.51411: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204111.51413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204111.51415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204111.51416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204111.51418: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204111.51433: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204111.64669: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27004, "dev": 21, "nlink": 1, "atime": 1727204110.04222, "mtime": 1727204110.04222, "ctime": 1727204110.04222, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 16142 1727204111.65710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204111.65769: stderr chunk (state=3): >>><<< 16142 1727204111.65773: stdout chunk (state=3): >>><<< 16142 1727204111.65793: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27004, "dev": 21, "nlink": 1, "atime": 1727204110.04222, "mtime": 1727204110.04222, "ctime": 1727204110.04222, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
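Annotation (not part of the captured log): the module invocation above is the stat call from get_interface_stat.yml. A minimal sketch of that task, reconstructed only from the module_args visible in this log (the real task file may differ in details), would be:

- name: "Get stat for interface {{ interface }}"
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"   # resolves to /sys/class/net/test1 in this run
    get_attributes: false
    get_checksum: false
    get_mime: false
    follow: false
  register: interface_stat   # read back as 'interface_stat' by the assert task that follows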
16142 1727204111.65830: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204111.395331-17157-105790363209994/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204111.65842: _low_level_execute_command(): starting 16142 1727204111.65845: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204111.395331-17157-105790363209994/ > /dev/null 2>&1 && sleep 0' 16142 1727204111.66320: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204111.66332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204111.66361: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204111.66376: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204111.66424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204111.66437: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204111.66448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204111.66495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204111.68322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204111.68387: stderr chunk (state=3): >>><<< 16142 1727204111.68390: stdout chunk (state=3): >>><<< 16142 1727204111.68405: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204111.68412: handler run complete 16142 1727204111.68447: attempt loop complete, returning result 16142 1727204111.68450: _execute() done 16142 1727204111.68452: dumping result to json 16142 1727204111.68457: done dumping result, returning 16142 1727204111.68469: done running TaskExecutor() for managed-node2/TASK: Get stat for interface test1 [0affcd87-79f5-fddd-f6c7-000000000248] 16142 1727204111.68472: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000248 16142 1727204111.68583: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000248 16142 1727204111.68586: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204110.04222, "block_size": 4096, "blocks": 0, "ctime": 1727204110.04222, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27004, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1727204110.04222, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 16142 1727204111.68678: no more pending results, returning what we have 16142 1727204111.68682: results queue empty 16142 1727204111.68683: checking for any_errors_fatal 16142 1727204111.68685: done checking for any_errors_fatal 16142 1727204111.68685: checking for max_fail_percentage 16142 1727204111.68687: done checking for max_fail_percentage 16142 1727204111.68688: checking to see if all hosts have failed and the running result is not ok 16142 1727204111.68689: done checking to see if all hosts have failed 16142 1727204111.68689: getting the remaining hosts for this loop 16142 1727204111.68691: done getting the remaining hosts for this loop 16142 1727204111.68694: getting the next task for host managed-node2 16142 1727204111.68701: done getting next task for host managed-node2 16142 1727204111.68704: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 16142 1727204111.68707: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204111.68710: getting variables 16142 1727204111.68712: in VariableManager get_vars() 16142 1727204111.68757: Calling all_inventory to load vars for managed-node2 16142 1727204111.68759: Calling groups_inventory to load vars for managed-node2 16142 1727204111.68761: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204111.68773: Calling all_plugins_play to load vars for managed-node2 16142 1727204111.68777: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204111.68780: Calling groups_plugins_play to load vars for managed-node2 16142 1727204111.68934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204111.69054: done with get_vars() 16142 1727204111.69063: done getting variables 16142 1727204111.69137: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 16142 1727204111.69230: variable 'interface' from source: task vars 16142 1727204111.69234: variable 'dhcp_interface1' from source: play vars 16142 1727204111.69279: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:55:11 -0400 (0:00:00.353) 0:00:10.869 ***** 16142 1727204111.69302: entering _queue_task() for managed-node2/assert 16142 1727204111.69304: Creating lock for assert 16142 1727204111.69511: worker is 1 (out of 1 available) 16142 1727204111.69523: exiting _queue_task() for managed-node2/assert 16142 1727204111.69537: done queuing things up, now waiting for results queue to drain 16142 1727204111.69538: waiting for pending results... 
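Annotation (not part of the captured log): the task queued here is assert_device_present.yml:5, and the entries below show it evaluating interface_stat.stat.exists. A minimal sketch of an assert task with the structure implied by this log (assumed, not the author's verified file):

- name: "Assert that the interface is present - '{{ interface }}'"
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists   # evaluated as True for test1 below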
16142 1727204111.69693: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'test1' 16142 1727204111.69774: in run() - task 0affcd87-79f5-fddd-f6c7-000000000017 16142 1727204111.69783: variable 'ansible_search_path' from source: unknown 16142 1727204111.69786: variable 'ansible_search_path' from source: unknown 16142 1727204111.69814: calling self._execute() 16142 1727204111.69881: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204111.69886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204111.69896: variable 'omit' from source: magic vars 16142 1727204111.70157: variable 'ansible_distribution_major_version' from source: facts 16142 1727204111.70168: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204111.70175: variable 'omit' from source: magic vars 16142 1727204111.70208: variable 'omit' from source: magic vars 16142 1727204111.70279: variable 'interface' from source: task vars 16142 1727204111.70283: variable 'dhcp_interface1' from source: play vars 16142 1727204111.70329: variable 'dhcp_interface1' from source: play vars 16142 1727204111.70344: variable 'omit' from source: magic vars 16142 1727204111.70377: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204111.70402: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204111.70421: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204111.70437: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204111.70447: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204111.70472: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204111.70475: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204111.70477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204111.70549: Set connection var ansible_timeout to 10 16142 1727204111.70552: Set connection var ansible_connection to ssh 16142 1727204111.70555: Set connection var ansible_shell_type to sh 16142 1727204111.70561: Set connection var ansible_shell_executable to /bin/sh 16142 1727204111.70567: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204111.70574: Set connection var ansible_pipelining to False 16142 1727204111.70591: variable 'ansible_shell_executable' from source: unknown 16142 1727204111.70594: variable 'ansible_connection' from source: unknown 16142 1727204111.70596: variable 'ansible_module_compression' from source: unknown 16142 1727204111.70598: variable 'ansible_shell_type' from source: unknown 16142 1727204111.70600: variable 'ansible_shell_executable' from source: unknown 16142 1727204111.70602: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204111.70607: variable 'ansible_pipelining' from source: unknown 16142 1727204111.70609: variable 'ansible_timeout' from source: unknown 16142 1727204111.70613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204111.70715: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204111.70722: variable 'omit' from source: magic vars 16142 1727204111.70728: starting attempt loop 16142 1727204111.70730: running the handler 16142 1727204111.70828: variable 'interface_stat' from source: set_fact 16142 1727204111.70846: Evaluated conditional (interface_stat.stat.exists): True 16142 1727204111.70849: handler run complete 16142 1727204111.70861: attempt loop complete, returning result 16142 1727204111.70865: _execute() done 16142 1727204111.70869: dumping result to json 16142 1727204111.70871: done dumping result, returning 16142 1727204111.70881: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'test1' [0affcd87-79f5-fddd-f6c7-000000000017] 16142 1727204111.70884: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000017 16142 1727204111.70959: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000017 16142 1727204111.70962: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 16142 1727204111.71013: no more pending results, returning what we have 16142 1727204111.71017: results queue empty 16142 1727204111.71017: checking for any_errors_fatal 16142 1727204111.71027: done checking for any_errors_fatal 16142 1727204111.71027: checking for max_fail_percentage 16142 1727204111.71029: done checking for max_fail_percentage 16142 1727204111.71030: checking to see if all hosts have failed and the running result is not ok 16142 1727204111.71031: done checking to see if all hosts have failed 16142 1727204111.71032: getting the remaining hosts for this loop 16142 1727204111.71033: done getting the remaining hosts for this loop 16142 1727204111.71036: getting the next task for host managed-node2 16142 1727204111.71044: done getting next task for host managed-node2 16142 1727204111.71047: ^ task is: TASK: Include the task 'get_interface_stat.yml' 16142 1727204111.71049: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204111.71053: getting variables 16142 1727204111.71055: in VariableManager get_vars() 16142 1727204111.71107: Calling all_inventory to load vars for managed-node2 16142 1727204111.71110: Calling groups_inventory to load vars for managed-node2 16142 1727204111.71112: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204111.71120: Calling all_plugins_play to load vars for managed-node2 16142 1727204111.71123: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204111.71125: Calling groups_plugins_play to load vars for managed-node2 16142 1727204111.71240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204111.71362: done with get_vars() 16142 1727204111.71371: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:55:11 -0400 (0:00:00.021) 0:00:10.891 ***** 16142 1727204111.71435: entering _queue_task() for managed-node2/include_tasks 16142 1727204111.71618: worker is 1 (out of 1 available) 16142 1727204111.71630: exiting _queue_task() for managed-node2/include_tasks 16142 1727204111.71641: done queuing things up, now waiting for results queue to drain 16142 1727204111.71643: waiting for pending results... 16142 1727204111.71795: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 16142 1727204111.71872: in run() - task 0affcd87-79f5-fddd-f6c7-00000000001b 16142 1727204111.71883: variable 'ansible_search_path' from source: unknown 16142 1727204111.71886: variable 'ansible_search_path' from source: unknown 16142 1727204111.71915: calling self._execute() 16142 1727204111.72041: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204111.72049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204111.72057: variable 'omit' from source: magic vars 16142 1727204111.72311: variable 'ansible_distribution_major_version' from source: facts 16142 1727204111.72320: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204111.72325: _execute() done 16142 1727204111.72328: dumping result to json 16142 1727204111.72332: done dumping result, returning 16142 1727204111.72341: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-fddd-f6c7-00000000001b] 16142 1727204111.72352: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000001b 16142 1727204111.72435: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000001b 16142 1727204111.72437: WORKER PROCESS EXITING 16142 1727204111.72470: no more pending results, returning what we have 16142 1727204111.72475: in VariableManager get_vars() 16142 1727204111.72585: Calling all_inventory to load vars for managed-node2 16142 1727204111.72587: Calling groups_inventory to load vars for managed-node2 16142 1727204111.72589: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204111.72596: Calling all_plugins_play to load vars for managed-node2 16142 1727204111.72598: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204111.72599: Calling groups_plugins_play to load vars for managed-node2 16142 1727204111.72700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 16142 1727204111.72813: done with get_vars() 16142 1727204111.72818: variable 'ansible_search_path' from source: unknown 16142 1727204111.72819: variable 'ansible_search_path' from source: unknown 16142 1727204111.72842: we have included files to process 16142 1727204111.72843: generating all_blocks data 16142 1727204111.72844: done generating all_blocks data 16142 1727204111.72847: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 16142 1727204111.72848: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 16142 1727204111.72849: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 16142 1727204111.72972: done processing included file 16142 1727204111.72973: iterating over new_blocks loaded from include file 16142 1727204111.72974: in VariableManager get_vars() 16142 1727204111.72993: done with get_vars() 16142 1727204111.72994: filtering new block on tags 16142 1727204111.73005: done filtering new block on tags 16142 1727204111.73006: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 16142 1727204111.73009: extending task lists for all hosts with included blocks 16142 1727204111.73071: done extending task lists 16142 1727204111.73072: done processing included files 16142 1727204111.73072: results queue empty 16142 1727204111.73073: checking for any_errors_fatal 16142 1727204111.73074: done checking for any_errors_fatal 16142 1727204111.73075: checking for max_fail_percentage 16142 1727204111.73076: done checking for max_fail_percentage 16142 1727204111.73076: checking to see if all hosts have failed and the running result is not ok 16142 1727204111.73077: done checking to see if all hosts have failed 16142 1727204111.73077: getting the remaining hosts for this loop 16142 1727204111.73078: done getting the remaining hosts for this loop 16142 1727204111.73079: getting the next task for host managed-node2 16142 1727204111.73082: done getting next task for host managed-node2 16142 1727204111.73084: ^ task is: TASK: Get stat for interface {{ interface }} 16142 1727204111.73085: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204111.73087: getting variables 16142 1727204111.73087: in VariableManager get_vars() 16142 1727204111.73101: Calling all_inventory to load vars for managed-node2 16142 1727204111.73102: Calling groups_inventory to load vars for managed-node2 16142 1727204111.73104: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204111.73107: Calling all_plugins_play to load vars for managed-node2 16142 1727204111.73109: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204111.73110: Calling groups_plugins_play to load vars for managed-node2 16142 1727204111.73192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204111.73323: done with get_vars() 16142 1727204111.73329: done getting variables 16142 1727204111.73434: variable 'interface' from source: task vars 16142 1727204111.73437: variable 'dhcp_interface2' from source: play vars 16142 1727204111.73478: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:55:11 -0400 (0:00:00.020) 0:00:10.911 ***** 16142 1727204111.73499: entering _queue_task() for managed-node2/stat 16142 1727204111.73682: worker is 1 (out of 1 available) 16142 1727204111.73694: exiting _queue_task() for managed-node2/stat 16142 1727204111.73705: done queuing things up, now waiting for results queue to drain 16142 1727204111.73707: waiting for pending results... 16142 1727204111.73861: running TaskExecutor() for managed-node2/TASK: Get stat for interface test2 16142 1727204111.73955: in run() - task 0affcd87-79f5-fddd-f6c7-000000000260 16142 1727204111.73965: variable 'ansible_search_path' from source: unknown 16142 1727204111.73974: variable 'ansible_search_path' from source: unknown 16142 1727204111.74005: calling self._execute() 16142 1727204111.74071: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204111.74076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204111.74084: variable 'omit' from source: magic vars 16142 1727204111.74341: variable 'ansible_distribution_major_version' from source: facts 16142 1727204111.74351: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204111.74357: variable 'omit' from source: magic vars 16142 1727204111.74395: variable 'omit' from source: magic vars 16142 1727204111.74465: variable 'interface' from source: task vars 16142 1727204111.74469: variable 'dhcp_interface2' from source: play vars 16142 1727204111.74516: variable 'dhcp_interface2' from source: play vars 16142 1727204111.74529: variable 'omit' from source: magic vars 16142 1727204111.74563: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204111.74590: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204111.74608: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204111.74621: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204111.74637: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 16142 1727204111.74662: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204111.74667: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204111.74670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204111.74742: Set connection var ansible_timeout to 10 16142 1727204111.74749: Set connection var ansible_connection to ssh 16142 1727204111.74754: Set connection var ansible_shell_type to sh 16142 1727204111.74760: Set connection var ansible_shell_executable to /bin/sh 16142 1727204111.74766: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204111.74773: Set connection var ansible_pipelining to False 16142 1727204111.74788: variable 'ansible_shell_executable' from source: unknown 16142 1727204111.74791: variable 'ansible_connection' from source: unknown 16142 1727204111.74795: variable 'ansible_module_compression' from source: unknown 16142 1727204111.74797: variable 'ansible_shell_type' from source: unknown 16142 1727204111.74800: variable 'ansible_shell_executable' from source: unknown 16142 1727204111.74802: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204111.74804: variable 'ansible_pipelining' from source: unknown 16142 1727204111.74806: variable 'ansible_timeout' from source: unknown 16142 1727204111.74813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204111.74963: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204111.74970: variable 'omit' from source: magic vars 16142 1727204111.74977: starting attempt loop 16142 1727204111.74979: running the handler 16142 1727204111.74991: _low_level_execute_command(): starting 16142 1727204111.74997: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204111.75523: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204111.75543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204111.75556: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204111.75572: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204111.75582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204111.75625: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204111.75641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204111.75697: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204111.77312: stdout chunk (state=3): >>>/root <<< 16142 1727204111.77416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204111.77474: stderr chunk (state=3): >>><<< 16142 1727204111.77478: stdout chunk (state=3): >>><<< 16142 1727204111.77498: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204111.77508: _low_level_execute_command(): starting 16142 1727204111.77514: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204111.7749717-17174-197521405950605 `" && echo ansible-tmp-1727204111.7749717-17174-197521405950605="` echo /root/.ansible/tmp/ansible-tmp-1727204111.7749717-17174-197521405950605 `" ) && sleep 0' 16142 1727204111.77968: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204111.77985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204111.77997: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204111.78019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204111.78070: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204111.78090: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204111.78127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204111.79966: stdout chunk (state=3): 
>>>ansible-tmp-1727204111.7749717-17174-197521405950605=/root/.ansible/tmp/ansible-tmp-1727204111.7749717-17174-197521405950605 <<< 16142 1727204111.80083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204111.80135: stderr chunk (state=3): >>><<< 16142 1727204111.80139: stdout chunk (state=3): >>><<< 16142 1727204111.80155: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204111.7749717-17174-197521405950605=/root/.ansible/tmp/ansible-tmp-1727204111.7749717-17174-197521405950605 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204111.80195: variable 'ansible_module_compression' from source: unknown 16142 1727204111.80247: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 16142 1727204111.80277: variable 'ansible_facts' from source: unknown 16142 1727204111.80341: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204111.7749717-17174-197521405950605/AnsiballZ_stat.py 16142 1727204111.80450: Sending initial data 16142 1727204111.80459: Sent initial data (153 bytes) 16142 1727204111.81128: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204111.81131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204111.81169: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204111.81174: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204111.81176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204111.81227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204111.81231: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204111.81280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204111.82991: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204111.83025: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204111.83068: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpohcnk46i /root/.ansible/tmp/ansible-tmp-1727204111.7749717-17174-197521405950605/AnsiballZ_stat.py <<< 16142 1727204111.83102: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204111.83879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204111.83986: stderr chunk (state=3): >>><<< 16142 1727204111.83989: stdout chunk (state=3): >>><<< 16142 1727204111.84006: done transferring module to remote 16142 1727204111.84015: _low_level_execute_command(): starting 16142 1727204111.84020: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204111.7749717-17174-197521405950605/ /root/.ansible/tmp/ansible-tmp-1727204111.7749717-17174-197521405950605/AnsiballZ_stat.py && sleep 0' 16142 1727204111.84483: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204111.84486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204111.84522: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204111.84526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204111.84529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204111.84585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204111.84588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204111.84633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204111.86334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 16142 1727204111.86389: stderr chunk (state=3): >>><<< 16142 1727204111.86392: stdout chunk (state=3): >>><<< 16142 1727204111.86407: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204111.86411: _low_level_execute_command(): starting 16142 1727204111.86416: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204111.7749717-17174-197521405950605/AnsiballZ_stat.py && sleep 0' 16142 1727204111.86854: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204111.86869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204111.86889: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204111.86903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204111.86917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204111.86956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204111.86971: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204111.87036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204112.00142: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27590, "dev": 21, "nlink": 1, "atime": 1727204110.04993, "mtime": 1727204110.04993, "ctime": 1727204110.04993, "wusr": true, "rusr": true, 
"xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 16142 1727204112.01103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 16142 1727204112.01169: stderr chunk (state=3): >>><<< 16142 1727204112.01173: stdout chunk (state=3): >>><<< 16142 1727204112.01189: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27590, "dev": 21, "nlink": 1, "atime": 1727204110.04993, "mtime": 1727204110.04993, "ctime": 1727204110.04993, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
16142 1727204112.01229: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204111.7749717-17174-197521405950605/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204112.01245: _low_level_execute_command(): starting 16142 1727204112.01248: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204111.7749717-17174-197521405950605/ > /dev/null 2>&1 && sleep 0' 16142 1727204112.01720: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204112.01738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204112.01751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204112.01773: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204112.01817: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204112.01829: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204112.01882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204112.03655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204112.03710: stderr chunk (state=3): >>><<< 16142 1727204112.03715: stdout chunk (state=3): >>><<< 16142 1727204112.03733: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204112.03737: handler run complete 16142 1727204112.03770: attempt loop complete, returning result 16142 1727204112.03773: _execute() done 16142 1727204112.03775: dumping result to json 16142 1727204112.03780: done dumping result, returning 16142 1727204112.03788: done running TaskExecutor() for managed-node2/TASK: Get stat for interface test2 [0affcd87-79f5-fddd-f6c7-000000000260] 16142 1727204112.03793: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000260 16142 1727204112.03903: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000260 16142 1727204112.03906: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204110.04993, "block_size": 4096, "blocks": 0, "ctime": 1727204110.04993, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27590, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1727204110.04993, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 16142 1727204112.04012: no more pending results, returning what we have 16142 1727204112.04016: results queue empty 16142 1727204112.04017: checking for any_errors_fatal 16142 1727204112.04018: done checking for any_errors_fatal 16142 1727204112.04020: checking for max_fail_percentage 16142 1727204112.04022: done checking for max_fail_percentage 16142 1727204112.04023: checking to see if all hosts have failed and the running result is not ok 16142 1727204112.04023: done checking to see if all hosts have failed 16142 1727204112.04024: getting the remaining hosts for this loop 16142 1727204112.04026: done getting the remaining hosts for this loop 16142 1727204112.04030: getting the next task for host managed-node2 16142 1727204112.04040: done getting next task for host managed-node2 16142 1727204112.04042: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 16142 1727204112.04045: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204112.04048: getting variables 16142 1727204112.04050: in VariableManager get_vars() 16142 1727204112.04098: Calling all_inventory to load vars for managed-node2 16142 1727204112.04101: Calling groups_inventory to load vars for managed-node2 16142 1727204112.04103: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204112.04111: Calling all_plugins_play to load vars for managed-node2 16142 1727204112.04113: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204112.04115: Calling groups_plugins_play to load vars for managed-node2 16142 1727204112.04228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204112.04356: done with get_vars() 16142 1727204112.04368: done getting variables 16142 1727204112.04411: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204112.04503: variable 'interface' from source: task vars 16142 1727204112.04506: variable 'dhcp_interface2' from source: play vars 16142 1727204112.04551: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:55:12 -0400 (0:00:00.310) 0:00:11.222 ***** 16142 1727204112.04577: entering _queue_task() for managed-node2/assert 16142 1727204112.04772: worker is 1 (out of 1 available) 16142 1727204112.04785: exiting _queue_task() for managed-node2/assert 16142 1727204112.04797: done queuing things up, now waiting for results queue to drain 16142 1727204112.04798: waiting for pending results... 
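
The assert about to run (assert_device_present.yml:5) reduces to a single condition; a minimal sketch, where only the evaluated expression interface_stat.stat.exists is confirmed by the log:

- name: "Assert that the interface is present - '{{ interface }}'"
  assert:
    that:
      - interface_stat.stat.exists
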
16142 1727204112.04955: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'test2' 16142 1727204112.05036: in run() - task 0affcd87-79f5-fddd-f6c7-00000000001c 16142 1727204112.05046: variable 'ansible_search_path' from source: unknown 16142 1727204112.05050: variable 'ansible_search_path' from source: unknown 16142 1727204112.05082: calling self._execute() 16142 1727204112.05153: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204112.05157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204112.05167: variable 'omit' from source: magic vars 16142 1727204112.05489: variable 'ansible_distribution_major_version' from source: facts 16142 1727204112.05498: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204112.05504: variable 'omit' from source: magic vars 16142 1727204112.05544: variable 'omit' from source: magic vars 16142 1727204112.05612: variable 'interface' from source: task vars 16142 1727204112.05616: variable 'dhcp_interface2' from source: play vars 16142 1727204112.05663: variable 'dhcp_interface2' from source: play vars 16142 1727204112.05678: variable 'omit' from source: magic vars 16142 1727204112.05709: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204112.05739: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204112.05756: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204112.05771: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204112.05781: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204112.05804: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204112.05807: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204112.05810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204112.05886: Set connection var ansible_timeout to 10 16142 1727204112.05889: Set connection var ansible_connection to ssh 16142 1727204112.05891: Set connection var ansible_shell_type to sh 16142 1727204112.05896: Set connection var ansible_shell_executable to /bin/sh 16142 1727204112.05903: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204112.05909: Set connection var ansible_pipelining to False 16142 1727204112.05926: variable 'ansible_shell_executable' from source: unknown 16142 1727204112.05929: variable 'ansible_connection' from source: unknown 16142 1727204112.05934: variable 'ansible_module_compression' from source: unknown 16142 1727204112.05936: variable 'ansible_shell_type' from source: unknown 16142 1727204112.05938: variable 'ansible_shell_executable' from source: unknown 16142 1727204112.05941: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204112.05943: variable 'ansible_pipelining' from source: unknown 16142 1727204112.05945: variable 'ansible_timeout' from source: unknown 16142 1727204112.05954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204112.06061: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204112.06067: variable 'omit' from source: magic vars 16142 1727204112.06072: starting attempt loop 16142 1727204112.06074: running the handler 16142 1727204112.06169: variable 'interface_stat' from source: set_fact 16142 1727204112.06181: Evaluated conditional (interface_stat.stat.exists): True 16142 1727204112.06186: handler run complete 16142 1727204112.06197: attempt loop complete, returning result 16142 1727204112.06200: _execute() done 16142 1727204112.06202: dumping result to json 16142 1727204112.06205: done dumping result, returning 16142 1727204112.06210: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'test2' [0affcd87-79f5-fddd-f6c7-00000000001c] 16142 1727204112.06217: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000001c 16142 1727204112.06306: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000001c 16142 1727204112.06309: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 16142 1727204112.06372: no more pending results, returning what we have 16142 1727204112.06379: results queue empty 16142 1727204112.06380: checking for any_errors_fatal 16142 1727204112.06391: done checking for any_errors_fatal 16142 1727204112.06392: checking for max_fail_percentage 16142 1727204112.06393: done checking for max_fail_percentage 16142 1727204112.06394: checking to see if all hosts have failed and the running result is not ok 16142 1727204112.06395: done checking to see if all hosts have failed 16142 1727204112.06395: getting the remaining hosts for this loop 16142 1727204112.06396: done getting the remaining hosts for this loop 16142 1727204112.06399: getting the next task for host managed-node2 16142 1727204112.06406: done getting next task for host managed-node2 16142 1727204112.06408: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 16142 1727204112.06410: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204112.06413: getting variables 16142 1727204112.06414: in VariableManager get_vars() 16142 1727204112.06457: Calling all_inventory to load vars for managed-node2 16142 1727204112.06459: Calling groups_inventory to load vars for managed-node2 16142 1727204112.06461: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204112.06471: Calling all_plugins_play to load vars for managed-node2 16142 1727204112.06473: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204112.06476: Calling groups_plugins_play to load vars for managed-node2 16142 1727204112.06621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204112.06737: done with get_vars() 16142 1727204112.06745: done getting variables 16142 1727204112.06786: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:28 Tuesday 24 September 2024 14:55:12 -0400 (0:00:00.022) 0:00:11.244 ***** 16142 1727204112.06805: entering _queue_task() for managed-node2/command 16142 1727204112.06981: worker is 1 (out of 1 available) 16142 1727204112.06994: exiting _queue_task() for managed-node2/command 16142 1727204112.07006: done queuing things up, now waiting for results queue to drain 16142 1727204112.07008: waiting for pending results... 16142 1727204112.07153: running TaskExecutor() for managed-node2/TASK: Backup the /etc/resolv.conf for initscript 16142 1727204112.07214: in run() - task 0affcd87-79f5-fddd-f6c7-00000000001d 16142 1727204112.07225: variable 'ansible_search_path' from source: unknown 16142 1727204112.07258: calling self._execute() 16142 1727204112.07330: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204112.07339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204112.07344: variable 'omit' from source: magic vars 16142 1727204112.07601: variable 'ansible_distribution_major_version' from source: facts 16142 1727204112.07612: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204112.07691: variable 'network_provider' from source: set_fact 16142 1727204112.07699: Evaluated conditional (network_provider == "initscripts"): False 16142 1727204112.07706: when evaluation is False, skipping this task 16142 1727204112.07709: _execute() done 16142 1727204112.07712: dumping result to json 16142 1727204112.07714: done dumping result, returning 16142 1727204112.07721: done running TaskExecutor() for managed-node2/TASK: Backup the /etc/resolv.conf for initscript [0affcd87-79f5-fddd-f6c7-00000000001d] 16142 1727204112.07726: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000001d 16142 1727204112.07814: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000001d 16142 1727204112.07817: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 16142 1727204112.07867: no more pending results, returning what we have 16142 1727204112.07871: results 
queue empty 16142 1727204112.07872: checking for any_errors_fatal 16142 1727204112.07876: done checking for any_errors_fatal 16142 1727204112.07877: checking for max_fail_percentage 16142 1727204112.07878: done checking for max_fail_percentage 16142 1727204112.07879: checking to see if all hosts have failed and the running result is not ok 16142 1727204112.07880: done checking to see if all hosts have failed 16142 1727204112.07881: getting the remaining hosts for this loop 16142 1727204112.07882: done getting the remaining hosts for this loop 16142 1727204112.07885: getting the next task for host managed-node2 16142 1727204112.07890: done getting next task for host managed-node2 16142 1727204112.07892: ^ task is: TASK: TEST Add Bond with 2 ports 16142 1727204112.07894: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204112.07896: getting variables 16142 1727204112.07898: in VariableManager get_vars() 16142 1727204112.07945: Calling all_inventory to load vars for managed-node2 16142 1727204112.07947: Calling groups_inventory to load vars for managed-node2 16142 1727204112.07949: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204112.07956: Calling all_plugins_play to load vars for managed-node2 16142 1727204112.07957: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204112.07959: Calling groups_plugins_play to load vars for managed-node2 16142 1727204112.08069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204112.08185: done with get_vars() 16142 1727204112.08192: done getting variables 16142 1727204112.08229: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST Add Bond with 2 ports] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:33 Tuesday 24 September 2024 14:55:12 -0400 (0:00:00.014) 0:00:11.259 ***** 16142 1727204112.08248: entering _queue_task() for managed-node2/debug 16142 1727204112.08417: worker is 1 (out of 1 available) 16142 1727204112.08431: exiting _queue_task() for managed-node2/debug 16142 1727204112.08441: done queuing things up, now waiting for results queue to drain 16142 1727204112.08442: waiting for pending results... 
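
The resolv.conf backup above was skipped because its when: clause evaluated False. In playbook form the skipped task (tests_bond_removal.yml:28) looks roughly like the following; the command itself is a hypothetical placeholder, since a skipped task never renders its arguments into the log:

- name: Backup the /etc/resolv.conf for initscript
  command: cp /etc/resolv.conf /tmp/resolv.conf.bak   # hypothetical command; not visible in this log
  when: network_provider == "initscripts"
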
16142 1727204112.08591: running TaskExecutor() for managed-node2/TASK: TEST Add Bond with 2 ports 16142 1727204112.08643: in run() - task 0affcd87-79f5-fddd-f6c7-00000000001e 16142 1727204112.08655: variable 'ansible_search_path' from source: unknown 16142 1727204112.08684: calling self._execute() 16142 1727204112.08745: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204112.08749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204112.08757: variable 'omit' from source: magic vars 16142 1727204112.09071: variable 'ansible_distribution_major_version' from source: facts 16142 1727204112.09081: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204112.09087: variable 'omit' from source: magic vars 16142 1727204112.09103: variable 'omit' from source: magic vars 16142 1727204112.09126: variable 'omit' from source: magic vars 16142 1727204112.09160: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204112.09186: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204112.09203: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204112.09218: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204112.09226: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204112.09252: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204112.09259: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204112.09262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204112.09336: Set connection var ansible_timeout to 10 16142 1727204112.09339: Set connection var ansible_connection to ssh 16142 1727204112.09342: Set connection var ansible_shell_type to sh 16142 1727204112.09345: Set connection var ansible_shell_executable to /bin/sh 16142 1727204112.09350: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204112.09361: Set connection var ansible_pipelining to False 16142 1727204112.09380: variable 'ansible_shell_executable' from source: unknown 16142 1727204112.09384: variable 'ansible_connection' from source: unknown 16142 1727204112.09386: variable 'ansible_module_compression' from source: unknown 16142 1727204112.09389: variable 'ansible_shell_type' from source: unknown 16142 1727204112.09391: variable 'ansible_shell_executable' from source: unknown 16142 1727204112.09393: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204112.09396: variable 'ansible_pipelining' from source: unknown 16142 1727204112.09398: variable 'ansible_timeout' from source: unknown 16142 1727204112.09402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204112.09505: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204112.09513: variable 'omit' from source: magic vars 16142 1727204112.09518: starting attempt loop 16142 1727204112.09521: running the 
handler 16142 1727204112.09556: handler run complete 16142 1727204112.09570: attempt loop complete, returning result 16142 1727204112.09572: _execute() done 16142 1727204112.09577: dumping result to json 16142 1727204112.09579: done dumping result, returning 16142 1727204112.09584: done running TaskExecutor() for managed-node2/TASK: TEST Add Bond with 2 ports [0affcd87-79f5-fddd-f6c7-00000000001e] 16142 1727204112.09594: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000001e 16142 1727204112.09672: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000001e 16142 1727204112.09675: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: ################################################## 16142 1727204112.09721: no more pending results, returning what we have 16142 1727204112.09725: results queue empty 16142 1727204112.09725: checking for any_errors_fatal 16142 1727204112.09731: done checking for any_errors_fatal 16142 1727204112.09731: checking for max_fail_percentage 16142 1727204112.09733: done checking for max_fail_percentage 16142 1727204112.09734: checking to see if all hosts have failed and the running result is not ok 16142 1727204112.09735: done checking to see if all hosts have failed 16142 1727204112.09736: getting the remaining hosts for this loop 16142 1727204112.09737: done getting the remaining hosts for this loop 16142 1727204112.09740: getting the next task for host managed-node2 16142 1727204112.09746: done getting next task for host managed-node2 16142 1727204112.09753: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 16142 1727204112.09756: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204112.09773: getting variables 16142 1727204112.09775: in VariableManager get_vars() 16142 1727204112.09819: Calling all_inventory to load vars for managed-node2 16142 1727204112.09822: Calling groups_inventory to load vars for managed-node2 16142 1727204112.09824: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204112.09832: Calling all_plugins_play to load vars for managed-node2 16142 1727204112.09834: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204112.09837: Calling groups_plugins_play to load vars for managed-node2 16142 1727204112.09981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204112.10103: done with get_vars() 16142 1727204112.10110: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:55:12 -0400 (0:00:00.019) 0:00:11.278 ***** 16142 1727204112.10178: entering _queue_task() for managed-node2/include_tasks 16142 1727204112.10358: worker is 1 (out of 1 available) 16142 1727204112.10373: exiting _queue_task() for managed-node2/include_tasks 16142 1727204112.10384: done queuing things up, now waiting for results queue to drain 16142 1727204112.10386: waiting for pending results... 16142 1727204112.10542: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 16142 1727204112.10631: in run() - task 0affcd87-79f5-fddd-f6c7-000000000026 16142 1727204112.10644: variable 'ansible_search_path' from source: unknown 16142 1727204112.10648: variable 'ansible_search_path' from source: unknown 16142 1727204112.10681: calling self._execute() 16142 1727204112.10739: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204112.10743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204112.10751: variable 'omit' from source: magic vars 16142 1727204112.11019: variable 'ansible_distribution_major_version' from source: facts 16142 1727204112.11029: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204112.11036: _execute() done 16142 1727204112.11041: dumping result to json 16142 1727204112.11043: done dumping result, returning 16142 1727204112.11050: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-fddd-f6c7-000000000026] 16142 1727204112.11056: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000026 16142 1727204112.11142: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000026 16142 1727204112.11145: WORKER PROCESS EXITING 16142 1727204112.11190: no more pending results, returning what we have 16142 1727204112.11195: in VariableManager get_vars() 16142 1727204112.11250: Calling all_inventory to load vars for managed-node2 16142 1727204112.11260: Calling groups_inventory to load vars for managed-node2 16142 1727204112.11262: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204112.11274: Calling all_plugins_play to load vars for managed-node2 16142 1727204112.11277: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204112.11279: Calling groups_plugins_play to load vars for managed-node2 16142 1727204112.11397: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204112.11517: done with get_vars() 16142 1727204112.11523: variable 'ansible_search_path' from source: unknown 16142 1727204112.11523: variable 'ansible_search_path' from source: unknown 16142 1727204112.11552: we have included files to process 16142 1727204112.11552: generating all_blocks data 16142 1727204112.11554: done generating all_blocks data 16142 1727204112.11557: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 16142 1727204112.11558: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 16142 1727204112.11559: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 16142 1727204112.12053: done processing included file 16142 1727204112.12054: iterating over new_blocks loaded from include file 16142 1727204112.12055: in VariableManager get_vars() 16142 1727204112.12078: done with get_vars() 16142 1727204112.12079: filtering new block on tags 16142 1727204112.12090: done filtering new block on tags 16142 1727204112.12092: in VariableManager get_vars() 16142 1727204112.12109: done with get_vars() 16142 1727204112.12110: filtering new block on tags 16142 1727204112.12122: done filtering new block on tags 16142 1727204112.12123: in VariableManager get_vars() 16142 1727204112.12144: done with get_vars() 16142 1727204112.12146: filtering new block on tags 16142 1727204112.12157: done filtering new block on tags 16142 1727204112.12158: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 16142 1727204112.12162: extending task lists for all hosts with included blocks 16142 1727204112.12652: done extending task lists 16142 1727204112.12653: done processing included files 16142 1727204112.12654: results queue empty 16142 1727204112.12654: checking for any_errors_fatal 16142 1727204112.12656: done checking for any_errors_fatal 16142 1727204112.12657: checking for max_fail_percentage 16142 1727204112.12658: done checking for max_fail_percentage 16142 1727204112.12658: checking to see if all hosts have failed and the running result is not ok 16142 1727204112.12659: done checking to see if all hosts have failed 16142 1727204112.12659: getting the remaining hosts for this loop 16142 1727204112.12660: done getting the remaining hosts for this loop 16142 1727204112.12662: getting the next task for host managed-node2 16142 1727204112.12666: done getting next task for host managed-node2 16142 1727204112.12669: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 16142 1727204112.12671: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204112.12678: getting variables 16142 1727204112.12679: in VariableManager get_vars() 16142 1727204112.12695: Calling all_inventory to load vars for managed-node2 16142 1727204112.12696: Calling groups_inventory to load vars for managed-node2 16142 1727204112.12697: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204112.12701: Calling all_plugins_play to load vars for managed-node2 16142 1727204112.12703: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204112.12704: Calling groups_plugins_play to load vars for managed-node2 16142 1727204112.12806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204112.12928: done with get_vars() 16142 1727204112.12936: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:55:12 -0400 (0:00:00.028) 0:00:11.306 ***** 16142 1727204112.12986: entering _queue_task() for managed-node2/setup 16142 1727204112.13206: worker is 1 (out of 1 available) 16142 1727204112.13220: exiting _queue_task() for managed-node2/setup 16142 1727204112.13232: done queuing things up, now waiting for results queue to drain 16142 1727204112.13234: waiting for pending results... 16142 1727204112.13405: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 16142 1727204112.13499: in run() - task 0affcd87-79f5-fddd-f6c7-00000000027e 16142 1727204112.13509: variable 'ansible_search_path' from source: unknown 16142 1727204112.13512: variable 'ansible_search_path' from source: unknown 16142 1727204112.13544: calling self._execute() 16142 1727204112.13607: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204112.13611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204112.13620: variable 'omit' from source: magic vars 16142 1727204112.13889: variable 'ansible_distribution_major_version' from source: facts 16142 1727204112.13902: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204112.14051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204112.15615: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204112.15661: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204112.15689: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204112.15716: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204112.15737: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204112.15799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
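
The include processed a little further up (roles/network/tasks/main.yml:4) simply pulls set_facts.yml into the play; as a task it is essentially just this minimal sketch:

- name: Ensure ansible_facts used by role
  include_tasks: set_facts.yml   # resolved above to .../roles/network/tasks/set_facts.yml
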
16142 1727204112.15818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204112.15843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204112.15873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204112.15884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204112.15921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204112.15940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204112.15966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204112.15995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204112.16006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204112.16119: variable '__network_required_facts' from source: role '' defaults 16142 1727204112.16126: variable 'ansible_facts' from source: unknown 16142 1727204112.16192: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 16142 1727204112.16195: when evaluation is False, skipping this task 16142 1727204112.16198: _execute() done 16142 1727204112.16201: dumping result to json 16142 1727204112.16203: done dumping result, returning 16142 1727204112.16209: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-fddd-f6c7-00000000027e] 16142 1727204112.16214: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000027e 16142 1727204112.16304: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000027e 16142 1727204112.16307: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204112.16351: no more pending results, returning what we have 16142 1727204112.16355: results queue empty 16142 1727204112.16356: checking for any_errors_fatal 16142 1727204112.16357: done checking for any_errors_fatal 16142 1727204112.16358: checking for max_fail_percentage 16142 1727204112.16360: done checking for max_fail_percentage 16142 1727204112.16361: checking to see if all hosts have failed and the running 
result is not ok 16142 1727204112.16362: done checking to see if all hosts have failed 16142 1727204112.16362: getting the remaining hosts for this loop 16142 1727204112.16365: done getting the remaining hosts for this loop 16142 1727204112.16369: getting the next task for host managed-node2 16142 1727204112.16378: done getting next task for host managed-node2 16142 1727204112.16387: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 16142 1727204112.16391: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204112.16406: getting variables 16142 1727204112.16407: in VariableManager get_vars() 16142 1727204112.16461: Calling all_inventory to load vars for managed-node2 16142 1727204112.16466: Calling groups_inventory to load vars for managed-node2 16142 1727204112.16468: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204112.16476: Calling all_plugins_play to load vars for managed-node2 16142 1727204112.16479: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204112.16481: Calling groups_plugins_play to load vars for managed-node2 16142 1727204112.16612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204112.16743: done with get_vars() 16142 1727204112.16752: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:55:12 -0400 (0:00:00.038) 0:00:11.345 ***** 16142 1727204112.16826: entering _queue_task() for managed-node2/stat 16142 1727204112.17023: worker is 1 (out of 1 available) 16142 1727204112.17039: exiting _queue_task() for managed-node2/stat 16142 1727204112.17051: done queuing things up, now waiting for results queue to drain 16142 1727204112.17053: waiting for pending results... 
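
The setup task skipped just above (set_facts.yml:3) only re-gathers facts when something listed in __network_required_facts is missing from ansible_facts. A hedged reconstruction follows; only the when: expression and the no_log behaviour are visible in the log, the gather_subset argument is an assumption:

- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min            # assumption; module arguments are hidden because the task skipped
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true                    # matches the "censored" result above
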
16142 1727204112.17213: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 16142 1727204112.17303: in run() - task 0affcd87-79f5-fddd-f6c7-000000000280 16142 1727204112.17314: variable 'ansible_search_path' from source: unknown 16142 1727204112.17317: variable 'ansible_search_path' from source: unknown 16142 1727204112.17346: calling self._execute() 16142 1727204112.17459: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204112.17463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204112.17475: variable 'omit' from source: magic vars 16142 1727204112.17727: variable 'ansible_distribution_major_version' from source: facts 16142 1727204112.17738: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204112.17851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204112.18038: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204112.18071: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204112.18100: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204112.18130: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204112.18197: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204112.18215: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204112.18236: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204112.18257: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204112.18319: variable '__network_is_ostree' from source: set_fact 16142 1727204112.18325: Evaluated conditional (not __network_is_ostree is defined): False 16142 1727204112.18328: when evaluation is False, skipping this task 16142 1727204112.18330: _execute() done 16142 1727204112.18335: dumping result to json 16142 1727204112.18338: done dumping result, returning 16142 1727204112.18349: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-fddd-f6c7-000000000280] 16142 1727204112.18355: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000280 16142 1727204112.18436: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000280 16142 1727204112.18439: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 16142 1727204112.18586: no more pending results, returning what we have 16142 1727204112.18589: results queue empty 16142 1727204112.18590: checking for any_errors_fatal 16142 1727204112.18593: done checking for any_errors_fatal 16142 1727204112.18593: checking for 
max_fail_percentage 16142 1727204112.18595: done checking for max_fail_percentage 16142 1727204112.18596: checking to see if all hosts have failed and the running result is not ok 16142 1727204112.18596: done checking to see if all hosts have failed 16142 1727204112.18597: getting the remaining hosts for this loop 16142 1727204112.18598: done getting the remaining hosts for this loop 16142 1727204112.18601: getting the next task for host managed-node2 16142 1727204112.18606: done getting next task for host managed-node2 16142 1727204112.18609: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 16142 1727204112.18613: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204112.18623: getting variables 16142 1727204112.18624: in VariableManager get_vars() 16142 1727204112.18654: Calling all_inventory to load vars for managed-node2 16142 1727204112.18656: Calling groups_inventory to load vars for managed-node2 16142 1727204112.18657: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204112.18666: Calling all_plugins_play to load vars for managed-node2 16142 1727204112.18669: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204112.18671: Calling groups_plugins_play to load vars for managed-node2 16142 1727204112.18769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204112.18898: done with get_vars() 16142 1727204112.18906: done getting variables 16142 1727204112.18945: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:55:12 -0400 (0:00:00.021) 0:00:11.366 ***** 16142 1727204112.18972: entering _queue_task() for managed-node2/set_fact 16142 1727204112.19158: worker is 1 (out of 1 available) 16142 1727204112.19172: exiting _queue_task() for managed-node2/set_fact 16142 1727204112.19183: done queuing things up, now waiting for results queue to drain 16142 1727204112.19185: waiting for pending results... 
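
The ostree check that just skipped (set_facts.yml:12) is a stat guarded so it only runs when the flag has not been computed yet; roughly as below, where the path and register name are hypothetical because a skipped task exposes neither:

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted        # hypothetical path, not shown in this log
  register: __ostree_booted_stat    # hypothetical name
  when: not __network_is_ostree is defined
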
16142 1727204112.19341: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 16142 1727204112.19435: in run() - task 0affcd87-79f5-fddd-f6c7-000000000281 16142 1727204112.19446: variable 'ansible_search_path' from source: unknown 16142 1727204112.19450: variable 'ansible_search_path' from source: unknown 16142 1727204112.19483: calling self._execute() 16142 1727204112.19546: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204112.19550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204112.19558: variable 'omit' from source: magic vars 16142 1727204112.19820: variable 'ansible_distribution_major_version' from source: facts 16142 1727204112.19831: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204112.19946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204112.20138: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204112.20191: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204112.20217: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204112.20244: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204112.20308: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204112.20327: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204112.20346: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204112.20366: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204112.20431: variable '__network_is_ostree' from source: set_fact 16142 1727204112.20437: Evaluated conditional (not __network_is_ostree is defined): False 16142 1727204112.20440: when evaluation is False, skipping this task 16142 1727204112.20442: _execute() done 16142 1727204112.20445: dumping result to json 16142 1727204112.20447: done dumping result, returning 16142 1727204112.20452: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-fddd-f6c7-000000000281] 16142 1727204112.20457: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000281 16142 1727204112.20542: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000281 16142 1727204112.20545: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 16142 1727204112.20593: no more pending results, returning what we have 16142 1727204112.20597: results queue empty 16142 1727204112.20598: checking for any_errors_fatal 16142 1727204112.20602: done checking for any_errors_fatal 16142 
1727204112.20603: checking for max_fail_percentage 16142 1727204112.20605: done checking for max_fail_percentage 16142 1727204112.20606: checking to see if all hosts have failed and the running result is not ok 16142 1727204112.20607: done checking to see if all hosts have failed 16142 1727204112.20607: getting the remaining hosts for this loop 16142 1727204112.20609: done getting the remaining hosts for this loop 16142 1727204112.20612: getting the next task for host managed-node2 16142 1727204112.20619: done getting next task for host managed-node2 16142 1727204112.20623: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 16142 1727204112.20627: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204112.20642: getting variables 16142 1727204112.20643: in VariableManager get_vars() 16142 1727204112.20693: Calling all_inventory to load vars for managed-node2 16142 1727204112.20696: Calling groups_inventory to load vars for managed-node2 16142 1727204112.20697: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204112.20704: Calling all_plugins_play to load vars for managed-node2 16142 1727204112.20705: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204112.20707: Calling groups_plugins_play to load vars for managed-node2 16142 1727204112.20818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204112.20966: done with get_vars() 16142 1727204112.20976: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:55:12 -0400 (0:00:00.020) 0:00:11.387 ***** 16142 1727204112.21044: entering _queue_task() for managed-node2/service_facts 16142 1727204112.21045: Creating lock for service_facts 16142 1727204112.21236: worker is 1 (out of 1 available) 16142 1727204112.21249: exiting _queue_task() for managed-node2/service_facts 16142 1727204112.21263: done queuing things up, now waiting for results queue to drain 16142 1727204112.21265: waiting for pending results... 
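
The companion set_fact (set_facts.yml:17) would normally persist that flag so both ostree tasks are skipped once __network_is_ostree exists, which is what happened in this run; a sketch, with the value expression being hypothetical:

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # hypothetical expression
  when: not __network_is_ostree is defined

The task queued immediately below, "Check which services are running" (set_facts.yml:21), uses the service_facts module, which typically appears as a bare "service_facts:" entry with no arguments.
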
16142 1727204112.21421: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 16142 1727204112.21508: in run() - task 0affcd87-79f5-fddd-f6c7-000000000283 16142 1727204112.21519: variable 'ansible_search_path' from source: unknown 16142 1727204112.21522: variable 'ansible_search_path' from source: unknown 16142 1727204112.21552: calling self._execute() 16142 1727204112.21613: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204112.21617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204112.21626: variable 'omit' from source: magic vars 16142 1727204112.21883: variable 'ansible_distribution_major_version' from source: facts 16142 1727204112.21893: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204112.21899: variable 'omit' from source: magic vars 16142 1727204112.21947: variable 'omit' from source: magic vars 16142 1727204112.21972: variable 'omit' from source: magic vars 16142 1727204112.22004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204112.22031: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204112.22047: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204112.22060: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204112.22071: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204112.22097: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204112.22100: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204112.22102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204112.22172: Set connection var ansible_timeout to 10 16142 1727204112.22176: Set connection var ansible_connection to ssh 16142 1727204112.22178: Set connection var ansible_shell_type to sh 16142 1727204112.22185: Set connection var ansible_shell_executable to /bin/sh 16142 1727204112.22189: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204112.22201: Set connection var ansible_pipelining to False 16142 1727204112.22217: variable 'ansible_shell_executable' from source: unknown 16142 1727204112.22220: variable 'ansible_connection' from source: unknown 16142 1727204112.22222: variable 'ansible_module_compression' from source: unknown 16142 1727204112.22225: variable 'ansible_shell_type' from source: unknown 16142 1727204112.22227: variable 'ansible_shell_executable' from source: unknown 16142 1727204112.22229: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204112.22233: variable 'ansible_pipelining' from source: unknown 16142 1727204112.22238: variable 'ansible_timeout' from source: unknown 16142 1727204112.22242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204112.22392: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204112.22399: variable 'omit' from source: magic vars 16142 
1727204112.22408: starting attempt loop 16142 1727204112.22412: running the handler 16142 1727204112.22421: _low_level_execute_command(): starting 16142 1727204112.22428: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204112.22957: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204112.22979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204112.22993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204112.23005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204112.23051: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204112.23074: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204112.23130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204112.24812: stdout chunk (state=3): >>>/root <<< 16142 1727204112.24957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204112.25012: stderr chunk (state=3): >>><<< 16142 1727204112.25022: stdout chunk (state=3): >>><<< 16142 1727204112.25050: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204112.25072: _low_level_execute_command(): starting 16142 1727204112.25082: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204112.250569-17189-278636979818536 `" && echo ansible-tmp-1727204112.250569-17189-278636979818536="` 
echo /root/.ansible/tmp/ansible-tmp-1727204112.250569-17189-278636979818536 `" ) && sleep 0' 16142 1727204112.25939: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204112.25942: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204112.25945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204112.25947: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204112.25957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204112.25959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204112.26015: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204112.26045: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204112.26075: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204112.26162: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204112.28093: stdout chunk (state=3): >>>ansible-tmp-1727204112.250569-17189-278636979818536=/root/.ansible/tmp/ansible-tmp-1727204112.250569-17189-278636979818536 <<< 16142 1727204112.28201: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204112.28262: stderr chunk (state=3): >>><<< 16142 1727204112.28267: stdout chunk (state=3): >>><<< 16142 1727204112.28284: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204112.250569-17189-278636979818536=/root/.ansible/tmp/ansible-tmp-1727204112.250569-17189-278636979818536 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204112.28327: variable 'ansible_module_compression' from source: unknown 16142 
1727204112.28365: ANSIBALLZ: Using lock for service_facts 16142 1727204112.28369: ANSIBALLZ: Acquiring lock 16142 1727204112.28371: ANSIBALLZ: Lock acquired: 140089292552240 16142 1727204112.28374: ANSIBALLZ: Creating module 16142 1727204112.37925: ANSIBALLZ: Writing module into payload 16142 1727204112.38008: ANSIBALLZ: Writing module 16142 1727204112.38034: ANSIBALLZ: Renaming module 16142 1727204112.38038: ANSIBALLZ: Done creating module 16142 1727204112.38051: variable 'ansible_facts' from source: unknown 16142 1727204112.38102: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204112.250569-17189-278636979818536/AnsiballZ_service_facts.py 16142 1727204112.38218: Sending initial data 16142 1727204112.38222: Sent initial data (161 bytes) 16142 1727204112.38945: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204112.38952: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204112.38987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204112.39000: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204112.39049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204112.39061: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204112.39073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204112.39135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204112.40987: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204112.41022: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204112.41059: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpoa64w7hc /root/.ansible/tmp/ansible-tmp-1727204112.250569-17189-278636979818536/AnsiballZ_service_facts.py <<< 16142 1727204112.41098: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204112.41930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 
1727204112.42050: stderr chunk (state=3): >>><<< 16142 1727204112.42054: stdout chunk (state=3): >>><<< 16142 1727204112.42074: done transferring module to remote 16142 1727204112.42082: _low_level_execute_command(): starting 16142 1727204112.42087: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204112.250569-17189-278636979818536/ /root/.ansible/tmp/ansible-tmp-1727204112.250569-17189-278636979818536/AnsiballZ_service_facts.py && sleep 0' 16142 1727204112.42566: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204112.42580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204112.42596: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204112.42612: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204112.42658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204112.42676: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204112.42723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204112.44502: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204112.44567: stderr chunk (state=3): >>><<< 16142 1727204112.44571: stdout chunk (state=3): >>><<< 16142 1727204112.44585: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204112.44592: _low_level_execute_command(): starting 16142 1727204112.44597: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1727204112.250569-17189-278636979818536/AnsiballZ_service_facts.py && sleep 0' 16142 1727204112.45069: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204112.45085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204112.45097: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204112.45108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204112.45118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204112.45173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204112.45194: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204112.45237: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204113.78826: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": 
"dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": 
{"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status":<<< 16142 1727204113.78841: stdout chunk (state=3): >>> "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "sourc<<< 16142 1727204113.78846: stdout chunk (state=3): >>>e": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": 
"disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.servic<<< 16142 1727204113.78852: stdout chunk (state=3): >>>e", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", 
"status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-au<<< 16142 1727204113.78860: stdout chunk (state=3): >>>tofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 16142 1727204113.80182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204113.80253: stderr chunk (state=3): >>><<< 16142 1727204113.80257: stdout chunk (state=3): >>><<< 16142 1727204113.80481: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": 
{"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": 
"systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
16142 1727204113.80772: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204112.250569-17189-278636979818536/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204113.80780: _low_level_execute_command(): starting 16142 1727204113.80784: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204112.250569-17189-278636979818536/ > /dev/null 2>&1 && sleep 0' 16142 1727204113.81239: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204113.81251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204113.81263: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204113.81276: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204113.81285: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204113.81327: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204113.81350: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204113.81390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204113.83180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204113.83269: stderr chunk (state=3): >>><<< 16142 1727204113.83272: stdout chunk (state=3): >>><<< 16142 1727204113.83870: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204113.83874: handler run complete 16142 1727204113.83876: variable 'ansible_facts' from source: unknown 16142 1727204113.83879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204113.84057: variable 'ansible_facts' from source: unknown 16142 1727204113.84187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204113.84377: attempt loop complete, returning result 16142 1727204113.84389: _execute() done 16142 1727204113.84395: dumping result to json 16142 1727204113.84455: done dumping result, returning 16142 1727204113.84480: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-fddd-f6c7-000000000283] 16142 1727204113.84494: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000283 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204113.85451: no more pending results, returning what we have 16142 1727204113.85454: results queue empty 16142 1727204113.85455: checking for any_errors_fatal 16142 1727204113.85458: done checking for any_errors_fatal 16142 1727204113.85459: checking for max_fail_percentage 16142 1727204113.85461: done checking for max_fail_percentage 16142 1727204113.85461: checking to see if all hosts have failed and the running result is not ok 16142 1727204113.85462: done checking to see if all hosts have failed 16142 1727204113.85463: getting the remaining hosts for this loop 16142 1727204113.85466: done getting the remaining hosts for this loop 16142 1727204113.85470: getting the next task for host managed-node2 16142 1727204113.85476: done getting next task for host managed-node2 16142 1727204113.85480: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 16142 1727204113.85484: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204113.85493: getting variables 16142 1727204113.85495: in VariableManager get_vars() 16142 1727204113.85542: Calling all_inventory to load vars for managed-node2 16142 1727204113.85545: Calling groups_inventory to load vars for managed-node2 16142 1727204113.85547: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204113.85557: Calling all_plugins_play to load vars for managed-node2 16142 1727204113.85559: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204113.85562: Calling groups_plugins_play to load vars for managed-node2 16142 1727204113.85902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204113.86387: done with get_vars() 16142 1727204113.86400: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:55:13 -0400 (0:00:01.654) 0:00:13.041 ***** 16142 1727204113.86506: entering _queue_task() for managed-node2/package_facts 16142 1727204113.86507: Creating lock for package_facts 16142 1727204113.86898: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000283 16142 1727204113.86907: WORKER PROCESS EXITING 16142 1727204113.87270: worker is 1 (out of 1 available) 16142 1727204113.87283: exiting _queue_task() for managed-node2/package_facts 16142 1727204113.87294: done queuing things up, now waiting for results queue to drain 16142 1727204113.87296: waiting for pending results... 16142 1727204113.87572: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 16142 1727204113.87740: in run() - task 0affcd87-79f5-fddd-f6c7-000000000284 16142 1727204113.87763: variable 'ansible_search_path' from source: unknown 16142 1727204113.87775: variable 'ansible_search_path' from source: unknown 16142 1727204113.87816: calling self._execute() 16142 1727204113.87908: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204113.87922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204113.87939: variable 'omit' from source: magic vars 16142 1727204113.88331: variable 'ansible_distribution_major_version' from source: facts 16142 1727204113.88353: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204113.88366: variable 'omit' from source: magic vars 16142 1727204113.88476: variable 'omit' from source: magic vars 16142 1727204113.88522: variable 'omit' from source: magic vars 16142 1727204113.88571: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204113.88613: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204113.88648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204113.88673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204113.88691: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204113.88731: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204113.88741: variable 'ansible_host' from source: host vars for 
'managed-node2' 16142 1727204113.88749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204113.88863: Set connection var ansible_timeout to 10 16142 1727204113.88873: Set connection var ansible_connection to ssh 16142 1727204113.88883: Set connection var ansible_shell_type to sh 16142 1727204113.88893: Set connection var ansible_shell_executable to /bin/sh 16142 1727204113.88902: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204113.88914: Set connection var ansible_pipelining to False 16142 1727204113.88942: variable 'ansible_shell_executable' from source: unknown 16142 1727204113.88954: variable 'ansible_connection' from source: unknown 16142 1727204113.88962: variable 'ansible_module_compression' from source: unknown 16142 1727204113.88973: variable 'ansible_shell_type' from source: unknown 16142 1727204113.88981: variable 'ansible_shell_executable' from source: unknown 16142 1727204113.88989: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204113.88997: variable 'ansible_pipelining' from source: unknown 16142 1727204113.89003: variable 'ansible_timeout' from source: unknown 16142 1727204113.89010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204113.89212: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204113.89229: variable 'omit' from source: magic vars 16142 1727204113.89238: starting attempt loop 16142 1727204113.89245: running the handler 16142 1727204113.89262: _low_level_execute_command(): starting 16142 1727204113.89279: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204113.90029: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204113.90049: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204113.90065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204113.90083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204113.90124: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204113.90136: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204113.90153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204113.90171: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204113.90181: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204113.90191: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204113.90203: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204113.90216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204113.90232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204113.90247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204113.90261: stderr chunk (state=3): >>>debug2: match found <<< 16142 
1727204113.90277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204113.90344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204113.90373: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204113.90391: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204113.90463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204113.92096: stdout chunk (state=3): >>>/root <<< 16142 1727204113.92196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204113.92304: stderr chunk (state=3): >>><<< 16142 1727204113.92320: stdout chunk (state=3): >>><<< 16142 1727204113.92469: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204113.92474: _low_level_execute_command(): starting 16142 1727204113.92476: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204113.9235969-17292-66368351943153 `" && echo ansible-tmp-1727204113.9235969-17292-66368351943153="` echo /root/.ansible/tmp/ansible-tmp-1727204113.9235969-17292-66368351943153 `" ) && sleep 0' 16142 1727204113.93099: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204113.93122: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204113.93139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204113.93158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204113.93203: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204113.93216: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204113.93240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204113.93258: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204113.93271: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204113.93282: stderr chunk (state=3): >>>debug1: re-parsing configuration 
<<< 16142 1727204113.93294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204113.93308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204113.93324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204113.93345: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204113.93358: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204113.93375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204113.93459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204113.93484: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204113.93500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204113.93583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204113.95447: stdout chunk (state=3): >>>ansible-tmp-1727204113.9235969-17292-66368351943153=/root/.ansible/tmp/ansible-tmp-1727204113.9235969-17292-66368351943153 <<< 16142 1727204113.95563: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204113.95665: stderr chunk (state=3): >>><<< 16142 1727204113.95681: stdout chunk (state=3): >>><<< 16142 1727204113.95772: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204113.9235969-17292-66368351943153=/root/.ansible/tmp/ansible-tmp-1727204113.9235969-17292-66368351943153 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204113.95975: variable 'ansible_module_compression' from source: unknown 16142 1727204113.95978: ANSIBALLZ: Using lock for package_facts 16142 1727204113.95980: ANSIBALLZ: Acquiring lock 16142 1727204113.95982: ANSIBALLZ: Lock acquired: 140089291474800 16142 1727204113.95984: ANSIBALLZ: Creating module 16142 1727204114.38400: ANSIBALLZ: Writing module into payload 16142 1727204114.38593: ANSIBALLZ: Writing module 16142 1727204114.38626: ANSIBALLZ: Renaming module 16142 1727204114.38632: ANSIBALLZ: Done creating module 16142 1727204114.38656: variable 'ansible_facts' from source: unknown 16142 1727204114.38830: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204113.9235969-17292-66368351943153/AnsiballZ_package_facts.py 16142 1727204114.38986: Sending initial data 16142 1727204114.38989: Sent initial data (161 bytes) 16142 1727204114.39932: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204114.39950: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204114.39955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204114.39972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204114.40008: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204114.40016: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204114.40034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204114.40054: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204114.40057: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204114.40060: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204114.40062: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204114.40086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204114.40088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204114.40094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204114.40112: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204114.40116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204114.40187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204114.40202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204114.40205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204114.40317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204114.42172: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204114.42178: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204114.42240: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpqhy2i6g8 /root/.ansible/tmp/ansible-tmp-1727204113.9235969-17292-66368351943153/AnsiballZ_package_facts.py <<< 16142 1727204114.42305: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204114.45174: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204114.45269: stderr chunk (state=3): >>><<< 16142 1727204114.45295: stdout chunk (state=3): >>><<< 16142 1727204114.45299: done transferring module to remote 16142 1727204114.45320: _low_level_execute_command(): starting 16142 1727204114.45333: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204113.9235969-17292-66368351943153/ /root/.ansible/tmp/ansible-tmp-1727204113.9235969-17292-66368351943153/AnsiballZ_package_facts.py && sleep 0' 16142 1727204114.46418: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204114.46422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204114.46459: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204114.46462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 16142 1727204114.46467: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204114.46469: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204114.46514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204114.46520: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204114.46577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204114.48384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204114.48388: stdout chunk (state=3): >>><<< 16142 1727204114.48390: stderr chunk (state=3): >>><<< 16142 1727204114.48410: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 16142 1727204114.48414: _low_level_execute_command(): starting 16142 1727204114.48418: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204113.9235969-17292-66368351943153/AnsiballZ_package_facts.py && sleep 0' 16142 1727204114.49189: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204114.49205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204114.49220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204114.49301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204114.49305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 16142 1727204114.49323: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204114.49337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204114.49444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204114.49474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204114.49548: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204114.96192: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", 
"version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "lib<<< 16142 1727204114.96204: stdout chunk (state=3): >>>xml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", 
"release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.<<< 16142 1727204114.96213: stdout chunk (state=3): >>>37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": 
"2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", 
"release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "r<<< 16142 1727204114.96222: stdout chunk (state=3): >>>elease": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": 
[{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": 
[{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "n<<< 16142 1727204114.96227: stdout chunk (state=3): >>>oarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap"<<< 16142 1727204114.96234: stdout chunk (state=3): >>>: [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", 
"release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", 
"version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 16142 1727204114.97772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 16142 1727204114.97776: stdout chunk (state=3): >>><<< 16142 1727204114.97778: stderr chunk (state=3): >>><<< 16142 1727204114.97869: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": 
"18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": 
"1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": 
[{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": 
[{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": 
"1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": 
"7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": 
"25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": 
"perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": 
[{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": 
"0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", 
"version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 16142 1727204115.02081: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204113.9235969-17292-66368351943153/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204115.02098: _low_level_execute_command(): starting 16142 1727204115.02101: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204113.9235969-17292-66368351943153/ > /dev/null 2>&1 && sleep 0' 16142 1727204115.02870: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204115.04757: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204115.04788: stdout chunk (state=3): >>><<< 16142 1727204115.04801: stderr chunk (state=3): >>><<< 16142 1727204115.04821: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204115.04835: handler run complete 16142 1727204115.05981: variable 'ansible_facts' from source: unknown 16142 1727204115.06599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204115.09368: variable 'ansible_facts' from source: unknown 16142 1727204115.09900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204115.10915: attempt loop complete, returning result 16142 1727204115.10940: _execute() done 16142 1727204115.10948: dumping result to json 16142 1727204115.11208: done dumping result, returning 16142 1727204115.11226: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-fddd-f6c7-000000000284] 16142 1727204115.11241: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000284 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204115.13648: no more pending results, returning what we have 16142 1727204115.13652: results queue empty 16142 1727204115.13653: checking for any_errors_fatal 16142 1727204115.13660: done checking for any_errors_fatal 16142 1727204115.13661: checking for max_fail_percentage 16142 1727204115.13663: done checking for max_fail_percentage 16142 1727204115.13666: checking to see if all hosts have failed and the running result is not ok 16142 1727204115.13667: done checking to see if all hosts have failed 16142 1727204115.13667: getting the remaining hosts for this loop 16142 1727204115.13669: done getting the remaining hosts for this loop 16142 1727204115.13673: getting the next task for host managed-node2 16142 1727204115.13681: done getting next task for host managed-node2 16142 1727204115.13685: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 16142 1727204115.13689: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 16142 1727204115.13700: getting variables 16142 1727204115.13702: in VariableManager get_vars() 16142 1727204115.13755: Calling all_inventory to load vars for managed-node2 16142 1727204115.13758: Calling groups_inventory to load vars for managed-node2 16142 1727204115.13760: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204115.13773: Calling all_plugins_play to load vars for managed-node2 16142 1727204115.13776: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204115.13779: Calling groups_plugins_play to load vars for managed-node2 16142 1727204115.14858: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000284 16142 1727204115.14862: WORKER PROCESS EXITING 16142 1727204115.15849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204115.17822: done with get_vars() 16142 1727204115.17858: done getting variables 16142 1727204115.17935: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:55:15 -0400 (0:00:01.314) 0:00:14.356 ***** 16142 1727204115.17972: entering _queue_task() for managed-node2/debug 16142 1727204115.18480: worker is 1 (out of 1 available) 16142 1727204115.18492: exiting _queue_task() for managed-node2/debug 16142 1727204115.18502: done queuing things up, now waiting for results queue to drain 16142 1727204115.18503: waiting for pending results... 
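The entries above show the package_facts run completing: the full RPM inventory returns over the multiplexed SSH connection, the remote temporary directory is removed, and the final task result is censored because no_log was set for it. A minimal sketch of a task that would produce this invocation, assuming only what the log reports (manager and strategy come from the recorded module_args; the no_log keyword is inferred from the censoring message):

- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto      # log records module_args {"manager": ["auto"]}
    strategy: first    # log records module_args {"strategy": "first"}
  no_log: true         # inferred from the "output has been hidden" result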
16142 1727204115.18803: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 16142 1727204115.18961: in run() - task 0affcd87-79f5-fddd-f6c7-000000000027 16142 1727204115.18988: variable 'ansible_search_path' from source: unknown 16142 1727204115.19039: variable 'ansible_search_path' from source: unknown 16142 1727204115.19088: calling self._execute() 16142 1727204115.19323: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204115.19339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204115.19355: variable 'omit' from source: magic vars 16142 1727204115.20344: variable 'ansible_distribution_major_version' from source: facts 16142 1727204115.20489: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204115.20502: variable 'omit' from source: magic vars 16142 1727204115.20771: variable 'omit' from source: magic vars 16142 1727204115.21002: variable 'network_provider' from source: set_fact 16142 1727204115.21246: variable 'omit' from source: magic vars 16142 1727204115.21359: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204115.21452: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204115.21513: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204115.21539: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204115.21557: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204115.21602: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204115.21620: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204115.21634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204115.21758: Set connection var ansible_timeout to 10 16142 1727204115.21769: Set connection var ansible_connection to ssh 16142 1727204115.21781: Set connection var ansible_shell_type to sh 16142 1727204115.21791: Set connection var ansible_shell_executable to /bin/sh 16142 1727204115.21800: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204115.21819: Set connection var ansible_pipelining to False 16142 1727204115.21851: variable 'ansible_shell_executable' from source: unknown 16142 1727204115.21861: variable 'ansible_connection' from source: unknown 16142 1727204115.21871: variable 'ansible_module_compression' from source: unknown 16142 1727204115.21878: variable 'ansible_shell_type' from source: unknown 16142 1727204115.21884: variable 'ansible_shell_executable' from source: unknown 16142 1727204115.21889: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204115.21896: variable 'ansible_pipelining' from source: unknown 16142 1727204115.21902: variable 'ansible_timeout' from source: unknown 16142 1727204115.21908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204115.22074: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 16142 1727204115.22092: variable 'omit' from source: magic vars 16142 1727204115.22126: starting attempt loop 16142 1727204115.22145: running the handler 16142 1727204115.22194: handler run complete 16142 1727204115.22214: attempt loop complete, returning result 16142 1727204115.22222: _execute() done 16142 1727204115.22228: dumping result to json 16142 1727204115.22238: done dumping result, returning 16142 1727204115.22259: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-fddd-f6c7-000000000027] 16142 1727204115.22272: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000027 ok: [managed-node2] => {} MSG: Using network provider: nm 16142 1727204115.22445: no more pending results, returning what we have 16142 1727204115.22449: results queue empty 16142 1727204115.22450: checking for any_errors_fatal 16142 1727204115.22459: done checking for any_errors_fatal 16142 1727204115.22460: checking for max_fail_percentage 16142 1727204115.22462: done checking for max_fail_percentage 16142 1727204115.22463: checking to see if all hosts have failed and the running result is not ok 16142 1727204115.22465: done checking to see if all hosts have failed 16142 1727204115.22466: getting the remaining hosts for this loop 16142 1727204115.22467: done getting the remaining hosts for this loop 16142 1727204115.22472: getting the next task for host managed-node2 16142 1727204115.22479: done getting next task for host managed-node2 16142 1727204115.22484: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 16142 1727204115.22488: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204115.22502: getting variables 16142 1727204115.22504: in VariableManager get_vars() 16142 1727204115.22571: Calling all_inventory to load vars for managed-node2 16142 1727204115.22574: Calling groups_inventory to load vars for managed-node2 16142 1727204115.22577: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204115.22587: Calling all_plugins_play to load vars for managed-node2 16142 1727204115.22590: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204115.22594: Calling groups_plugins_play to load vars for managed-node2 16142 1727204115.24748: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000027 16142 1727204115.24753: WORKER PROCESS EXITING 16142 1727204115.26892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204115.29062: done with get_vars() 16142 1727204115.29136: done getting variables 16142 1727204115.29395: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:55:15 -0400 (0:00:00.114) 0:00:14.471 ***** 16142 1727204115.29429: entering _queue_task() for managed-node2/fail 16142 1727204115.29430: Creating lock for fail 16142 1727204115.29770: worker is 1 (out of 1 available) 16142 1727204115.29784: exiting _queue_task() for managed-node2/fail 16142 1727204115.29797: done queuing things up, now waiting for results queue to drain 16142 1727204115.29798: waiting for pending results... 
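Just above, the Print network provider task at roles/network/tasks/main.yml:7 runs entirely on the controller and reports the provider chosen earlier by set_fact. A plausible sketch of that task, assuming the message is templated from the network_provider variable the log resolves:

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"   # rendered above as "Using network provider: nm"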
16142 1727204115.30773: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 16142 1727204115.31030: in run() - task 0affcd87-79f5-fddd-f6c7-000000000028 16142 1727204115.31056: variable 'ansible_search_path' from source: unknown 16142 1727204115.31066: variable 'ansible_search_path' from source: unknown 16142 1727204115.31109: calling self._execute() 16142 1727204115.31227: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204115.31330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204115.31348: variable 'omit' from source: magic vars 16142 1727204115.32438: variable 'ansible_distribution_major_version' from source: facts 16142 1727204115.32455: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204115.32578: variable 'network_state' from source: role '' defaults 16142 1727204115.32596: Evaluated conditional (network_state != {}): False 16142 1727204115.32604: when evaluation is False, skipping this task 16142 1727204115.32611: _execute() done 16142 1727204115.32618: dumping result to json 16142 1727204115.32625: done dumping result, returning 16142 1727204115.32636: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-fddd-f6c7-000000000028] 16142 1727204115.32652: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000028 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204115.32813: no more pending results, returning what we have 16142 1727204115.32818: results queue empty 16142 1727204115.32819: checking for any_errors_fatal 16142 1727204115.32827: done checking for any_errors_fatal 16142 1727204115.32828: checking for max_fail_percentage 16142 1727204115.32830: done checking for max_fail_percentage 16142 1727204115.32831: checking to see if all hosts have failed and the running result is not ok 16142 1727204115.32832: done checking to see if all hosts have failed 16142 1727204115.32833: getting the remaining hosts for this loop 16142 1727204115.32834: done getting the remaining hosts for this loop 16142 1727204115.32838: getting the next task for host managed-node2 16142 1727204115.32845: done getting next task for host managed-node2 16142 1727204115.32851: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 16142 1727204115.32854: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204115.32873: getting variables 16142 1727204115.32876: in VariableManager get_vars() 16142 1727204115.32938: Calling all_inventory to load vars for managed-node2 16142 1727204115.32942: Calling groups_inventory to load vars for managed-node2 16142 1727204115.32945: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204115.32958: Calling all_plugins_play to load vars for managed-node2 16142 1727204115.32961: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204115.32966: Calling groups_plugins_play to load vars for managed-node2 16142 1727204115.34030: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000028 16142 1727204115.34033: WORKER PROCESS EXITING 16142 1727204115.36526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204115.39904: done with get_vars() 16142 1727204115.39939: done getting variables 16142 1727204115.40006: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:55:15 -0400 (0:00:00.106) 0:00:14.577 ***** 16142 1727204115.40044: entering _queue_task() for managed-node2/fail 16142 1727204115.40370: worker is 1 (out of 1 available) 16142 1727204115.40386: exiting _queue_task() for managed-node2/fail 16142 1727204115.40397: done queuing things up, now waiting for results queue to drain 16142 1727204115.40398: waiting for pending results... 
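The guard at main.yml:11 is skipped because its reported false_condition is network_state != {}: the role default leaves network_state empty, so there is no state configuration to refuse. A rough sketch of such a guard, assuming a fail task gated on that condition (the message wording is an assumption, and any further initscripts-specific condition implied by the task name would short-circuit and not appear in the log):

- name: >-
    Abort applying the network state configuration if using the
    `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider   # assumed wording
  when: network_state != {}   # reported above as the false condition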
16142 1727204115.40850: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 16142 1727204115.41147: in run() - task 0affcd87-79f5-fddd-f6c7-000000000029 16142 1727204115.41197: variable 'ansible_search_path' from source: unknown 16142 1727204115.41213: variable 'ansible_search_path' from source: unknown 16142 1727204115.41263: calling self._execute() 16142 1727204115.41420: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204115.41437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204115.41452: variable 'omit' from source: magic vars 16142 1727204115.41867: variable 'ansible_distribution_major_version' from source: facts 16142 1727204115.41886: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204115.42119: variable 'network_state' from source: role '' defaults 16142 1727204115.42141: Evaluated conditional (network_state != {}): False 16142 1727204115.42167: when evaluation is False, skipping this task 16142 1727204115.42177: _execute() done 16142 1727204115.42201: dumping result to json 16142 1727204115.42228: done dumping result, returning 16142 1727204115.42251: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-fddd-f6c7-000000000029] 16142 1727204115.42272: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000029 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204115.42521: no more pending results, returning what we have 16142 1727204115.42526: results queue empty 16142 1727204115.42527: checking for any_errors_fatal 16142 1727204115.42536: done checking for any_errors_fatal 16142 1727204115.42537: checking for max_fail_percentage 16142 1727204115.42551: done checking for max_fail_percentage 16142 1727204115.42553: checking to see if all hosts have failed and the running result is not ok 16142 1727204115.42554: done checking to see if all hosts have failed 16142 1727204115.42554: getting the remaining hosts for this loop 16142 1727204115.42556: done getting the remaining hosts for this loop 16142 1727204115.42573: getting the next task for host managed-node2 16142 1727204115.42587: done getting next task for host managed-node2 16142 1727204115.42601: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 16142 1727204115.42605: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204115.42623: getting variables 16142 1727204115.42626: in VariableManager get_vars() 16142 1727204115.42710: Calling all_inventory to load vars for managed-node2 16142 1727204115.42713: Calling groups_inventory to load vars for managed-node2 16142 1727204115.42715: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204115.42748: Calling all_plugins_play to load vars for managed-node2 16142 1727204115.42752: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204115.42756: Calling groups_plugins_play to load vars for managed-node2 16142 1727204115.43858: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000029 16142 1727204115.43875: WORKER PROCESS EXITING 16142 1727204115.45073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204115.48540: done with get_vars() 16142 1727204115.48758: done getting variables 16142 1727204115.49073: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:55:15 -0400 (0:00:00.090) 0:00:14.667 ***** 16142 1727204115.49106: entering _queue_task() for managed-node2/fail 16142 1727204115.50129: worker is 1 (out of 1 available) 16142 1727204115.50143: exiting _queue_task() for managed-node2/fail 16142 1727204115.50156: done queuing things up, now waiting for results queue to drain 16142 1727204115.50157: waiting for pending results... 
16142 1727204115.50448: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 16142 1727204115.50618: in run() - task 0affcd87-79f5-fddd-f6c7-00000000002a 16142 1727204115.50639: variable 'ansible_search_path' from source: unknown 16142 1727204115.50646: variable 'ansible_search_path' from source: unknown 16142 1727204115.50688: calling self._execute() 16142 1727204115.50779: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204115.50791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204115.50803: variable 'omit' from source: magic vars 16142 1727204115.51455: variable 'ansible_distribution_major_version' from source: facts 16142 1727204115.51478: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204115.51718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204115.56569: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204115.56713: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204115.56819: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204115.56905: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204115.56940: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204115.57025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204115.57062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204115.57098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204115.57148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204115.57170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204115.57301: variable 'ansible_distribution_major_version' from source: facts 16142 1727204115.57324: Evaluated conditional (ansible_distribution_major_version | int > 9): False 16142 1727204115.57336: when evaluation is False, skipping this task 16142 1727204115.57345: _execute() done 16142 1727204115.57352: dumping result to json 16142 1727204115.57359: done dumping result, returning 16142 1727204115.57374: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-fddd-f6c7-00000000002a] 16142 1727204115.57385: sending task result for task 
0affcd87-79f5-fddd-f6c7-00000000002a 16142 1727204115.57499: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000002a skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 16142 1727204115.57551: no more pending results, returning what we have 16142 1727204115.57555: results queue empty 16142 1727204115.57557: checking for any_errors_fatal 16142 1727204115.57561: done checking for any_errors_fatal 16142 1727204115.57562: checking for max_fail_percentage 16142 1727204115.57567: done checking for max_fail_percentage 16142 1727204115.57568: checking to see if all hosts have failed and the running result is not ok 16142 1727204115.57569: done checking to see if all hosts have failed 16142 1727204115.57569: getting the remaining hosts for this loop 16142 1727204115.57571: done getting the remaining hosts for this loop 16142 1727204115.57576: getting the next task for host managed-node2 16142 1727204115.57609: done getting next task for host managed-node2 16142 1727204115.57615: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 16142 1727204115.57618: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204115.57636: getting variables 16142 1727204115.57638: in VariableManager get_vars() 16142 1727204115.57852: Calling all_inventory to load vars for managed-node2 16142 1727204115.57856: Calling groups_inventory to load vars for managed-node2 16142 1727204115.57859: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204115.57887: Calling all_plugins_play to load vars for managed-node2 16142 1727204115.57892: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204115.57896: Calling groups_plugins_play to load vars for managed-node2 16142 1727204115.59105: WORKER PROCESS EXITING 16142 1727204115.60554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204115.62696: done with get_vars() 16142 1727204115.62726: done getting variables 16142 1727204115.62959: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:55:15 -0400 (0:00:00.138) 0:00:14.806 ***** 16142 1727204115.63016: entering _queue_task() for managed-node2/dnf 16142 1727204115.63404: worker is 1 (out of 1 available) 16142 1727204115.63456: exiting _queue_task() for managed-node2/dnf 16142 1727204115.63520: done queuing things up, now waiting for results queue to drain 16142 1727204115.63546: waiting for pending results... 
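The teaming guard at main.yml:25 pulls in the distribution facts, evaluates ansible_distribution_major_version | int > 9, and skips on this EL9 host. A sketch of what such a guard could look like (the message is an assumption; the condition is taken from the log, and any additional team-specific condition would be short-circuited here):

- name: >-
    Abort applying teaming configuration if the system version of the
    managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on EL10 or later   # assumed wording
  when: ansible_distribution_major_version | int > 9   # reported above as the false condition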
16142 1727204115.64067: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 16142 1727204115.64216: in run() - task 0affcd87-79f5-fddd-f6c7-00000000002b 16142 1727204115.64237: variable 'ansible_search_path' from source: unknown 16142 1727204115.64249: variable 'ansible_search_path' from source: unknown 16142 1727204115.64299: calling self._execute() 16142 1727204115.64430: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204115.64445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204115.64487: variable 'omit' from source: magic vars 16142 1727204115.65419: variable 'ansible_distribution_major_version' from source: facts 16142 1727204115.65444: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204115.65714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204115.69888: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204115.70022: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204115.70568: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204115.70608: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204115.70640: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204115.70718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204115.70754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204115.70843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204115.70891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204115.70994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204115.71119: variable 'ansible_distribution' from source: facts 16142 1727204115.71129: variable 'ansible_distribution_major_version' from source: facts 16142 1727204115.71150: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 16142 1727204115.71281: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204115.71682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204115.71713: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204115.71744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204115.71795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204115.71814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204115.71859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204115.71894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204115.71922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204115.71969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204115.71993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204115.72036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204115.72065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204115.72123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204115.72242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204115.72262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204115.72429: variable 'network_connections' from source: task vars 16142 1727204115.72447: variable 'controller_profile' from source: play vars 16142 1727204115.72519: variable 'controller_profile' from source: play vars 16142 1727204115.72538: variable 'controller_device' from source: play vars 16142 1727204115.72627: variable 'controller_device' from source: play vars 16142 1727204115.72647: variable 'port1_profile' from 
source: play vars 16142 1727204115.72730: variable 'port1_profile' from source: play vars 16142 1727204115.72744: variable 'dhcp_interface1' from source: play vars 16142 1727204115.72809: variable 'dhcp_interface1' from source: play vars 16142 1727204115.72841: variable 'controller_profile' from source: play vars 16142 1727204115.72942: variable 'controller_profile' from source: play vars 16142 1727204115.72994: variable 'port2_profile' from source: play vars 16142 1727204115.73096: variable 'port2_profile' from source: play vars 16142 1727204115.73111: variable 'dhcp_interface2' from source: play vars 16142 1727204115.73398: variable 'dhcp_interface2' from source: play vars 16142 1727204115.73411: variable 'controller_profile' from source: play vars 16142 1727204115.73479: variable 'controller_profile' from source: play vars 16142 1727204115.73556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204115.73747: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204115.73796: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204115.73832: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204115.73869: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204115.73922: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204115.73970: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204115.74004: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204115.74038: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204115.74104: variable '__network_team_connections_defined' from source: role '' defaults 16142 1727204115.74403: variable 'network_connections' from source: task vars 16142 1727204115.74415: variable 'controller_profile' from source: play vars 16142 1727204115.74485: variable 'controller_profile' from source: play vars 16142 1727204115.74497: variable 'controller_device' from source: play vars 16142 1727204115.74606: variable 'controller_device' from source: play vars 16142 1727204115.74619: variable 'port1_profile' from source: play vars 16142 1727204115.74688: variable 'port1_profile' from source: play vars 16142 1727204115.74700: variable 'dhcp_interface1' from source: play vars 16142 1727204115.74759: variable 'dhcp_interface1' from source: play vars 16142 1727204115.74821: variable 'controller_profile' from source: play vars 16142 1727204115.74881: variable 'controller_profile' from source: play vars 16142 1727204115.75603: variable 'port2_profile' from source: play vars 16142 1727204115.75671: variable 'port2_profile' from source: play vars 16142 1727204115.75683: variable 'dhcp_interface2' from source: play vars 16142 1727204115.75816: variable 'dhcp_interface2' from source: play vars 
16142 1727204115.75828: variable 'controller_profile' from source: play vars 16142 1727204115.75891: variable 'controller_profile' from source: play vars 16142 1727204115.75935: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 16142 1727204115.75943: when evaluation is False, skipping this task 16142 1727204115.75950: _execute() done 16142 1727204115.75956: dumping result to json 16142 1727204115.75963: done dumping result, returning 16142 1727204115.75977: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-fddd-f6c7-00000000002b] 16142 1727204115.75987: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000002b 16142 1727204115.76215: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000002b 16142 1727204115.76222: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 16142 1727204115.76290: no more pending results, returning what we have 16142 1727204115.76295: results queue empty 16142 1727204115.76296: checking for any_errors_fatal 16142 1727204115.76302: done checking for any_errors_fatal 16142 1727204115.76303: checking for max_fail_percentage 16142 1727204115.76306: done checking for max_fail_percentage 16142 1727204115.76306: checking to see if all hosts have failed and the running result is not ok 16142 1727204115.76307: done checking to see if all hosts have failed 16142 1727204115.76308: getting the remaining hosts for this loop 16142 1727204115.76309: done getting the remaining hosts for this loop 16142 1727204115.76314: getting the next task for host managed-node2 16142 1727204115.76321: done getting next task for host managed-node2 16142 1727204115.76326: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 16142 1727204115.76328: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204115.76343: getting variables 16142 1727204115.76345: in VariableManager get_vars() 16142 1727204115.76403: Calling all_inventory to load vars for managed-node2 16142 1727204115.76406: Calling groups_inventory to load vars for managed-node2 16142 1727204115.76409: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204115.76420: Calling all_plugins_play to load vars for managed-node2 16142 1727204115.76423: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204115.76426: Calling groups_plugins_play to load vars for managed-node2 16142 1727204115.78105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204115.80472: done with get_vars() 16142 1727204115.80497: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 16142 1727204115.80578: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:55:15 -0400 (0:00:00.175) 0:00:14.982 ***** 16142 1727204115.80612: entering _queue_task() for managed-node2/yum 16142 1727204115.80614: Creating lock for yum 16142 1727204115.80923: worker is 1 (out of 1 available) 16142 1727204115.80936: exiting _queue_task() for managed-node2/yum 16142 1727204115.80947: done queuing things up, now waiting for results queue to drain 16142 1727204115.80949: waiting for pending results... 
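The variable walk above resolves the play vars referenced by network_connections: controller_profile, controller_device, port1_profile with dhcp_interface1, and port2_profile with dhcp_interface2, with each port looking back at the controller profile; no wireless or team interface turns up, so the conditional evaluates False and the DNF check is skipped. A purely illustrative sketch of play vars that would produce this lookup pattern (every value below is an assumption; only the variable names appear in the log):

controller_profile: bond0        # assumed value
controller_device: nm-bond       # assumed value
port1_profile: bond0.0           # assumed value
dhcp_interface1: test1           # assumed value
port2_profile: bond0.1           # assumed value
dhcp_interface2: test2           # assumed value

network_connections:
  - name: "{{ controller_profile }}"
    interface_name: "{{ controller_device }}"
    type: bond                           # assumed; a non-team controller type fits the skipped condition
  - name: "{{ port1_profile }}"
    interface_name: "{{ dhcp_interface1 }}"
    controller: "{{ controller_profile }}"
  - name: "{{ port2_profile }}"
    interface_name: "{{ dhcp_interface2 }}"
    controller: "{{ controller_profile }}"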
16142 1727204115.81400: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 16142 1727204115.81542: in run() - task 0affcd87-79f5-fddd-f6c7-00000000002c 16142 1727204115.81565: variable 'ansible_search_path' from source: unknown 16142 1727204115.81575: variable 'ansible_search_path' from source: unknown 16142 1727204115.81616: calling self._execute() 16142 1727204115.81702: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204115.81713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204115.81727: variable 'omit' from source: magic vars 16142 1727204115.82104: variable 'ansible_distribution_major_version' from source: facts 16142 1727204115.82125: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204115.82303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204115.85437: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204115.85612: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204115.85677: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204115.85784: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204115.85884: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204115.86083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204115.86119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204115.86151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204115.86223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204115.86308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204115.86526: variable 'ansible_distribution_major_version' from source: facts 16142 1727204115.86552: Evaluated conditional (ansible_distribution_major_version | int < 8): False 16142 1727204115.86620: when evaluation is False, skipping this task 16142 1727204115.86628: _execute() done 16142 1727204115.86636: dumping result to json 16142 1727204115.86643: done dumping result, returning 16142 1727204115.86656: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-fddd-f6c7-00000000002c] 16142 
1727204115.86682: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000002c skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 16142 1727204115.86877: no more pending results, returning what we have 16142 1727204115.86882: results queue empty 16142 1727204115.86883: checking for any_errors_fatal 16142 1727204115.86890: done checking for any_errors_fatal 16142 1727204115.86891: checking for max_fail_percentage 16142 1727204115.86893: done checking for max_fail_percentage 16142 1727204115.86894: checking to see if all hosts have failed and the running result is not ok 16142 1727204115.86895: done checking to see if all hosts have failed 16142 1727204115.86895: getting the remaining hosts for this loop 16142 1727204115.86897: done getting the remaining hosts for this loop 16142 1727204115.86901: getting the next task for host managed-node2 16142 1727204115.86913: done getting next task for host managed-node2 16142 1727204115.86918: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 16142 1727204115.86921: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204115.86936: getting variables 16142 1727204115.86938: in VariableManager get_vars() 16142 1727204115.86998: Calling all_inventory to load vars for managed-node2 16142 1727204115.87002: Calling groups_inventory to load vars for managed-node2 16142 1727204115.87004: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204115.87014: Calling all_plugins_play to load vars for managed-node2 16142 1727204115.87017: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204115.87020: Calling groups_plugins_play to load vars for managed-node2 16142 1727204115.88689: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000002c 16142 1727204115.88694: WORKER PROCESS EXITING 16142 1727204115.89697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204115.92400: done with get_vars() 16142 1727204115.92434: done getting variables 16142 1727204115.92507: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:55:15 -0400 (0:00:00.119) 0:00:15.102 ***** 16142 1727204115.92545: entering _queue_task() for managed-node2/fail 16142 1727204115.92880: worker is 1 (out of 1 available) 16142 1727204115.92894: exiting _queue_task() for managed-node2/fail 16142 1727204115.92906: done queuing things up, now waiting for results queue to drain 16142 1727204115.92907: waiting for pending results... 
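The next task, "Ask user's consent to restart NetworkManager due to wireless or team interfaces" (roles/network/tasks/main.yml:60), is backed by the ansible.builtin.fail action plugin loaded above. Its exact body is not visible in the log; a plausible sketch, keeping only the condition that the skip result further down reports and using an assumed message text, would be:

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    # message wording below is an assumption, not taken from the role source
    msg: >-
      NetworkManager must be restarted to support wireless or team interfaces;
      aborting because no consent was recorded.
  when: __network_wireless_connections_defined or __network_team_connections_defined

Because neither wireless nor team connections are defined for this run, the conditional evaluates to False and the task is skipped instead of failing the play.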
16142 1727204115.93298: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 16142 1727204115.93442: in run() - task 0affcd87-79f5-fddd-f6c7-00000000002d 16142 1727204115.93467: variable 'ansible_search_path' from source: unknown 16142 1727204115.93476: variable 'ansible_search_path' from source: unknown 16142 1727204115.93522: calling self._execute() 16142 1727204115.93614: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204115.93627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204115.93641: variable 'omit' from source: magic vars 16142 1727204115.94007: variable 'ansible_distribution_major_version' from source: facts 16142 1727204115.94030: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204115.94145: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204115.94344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204115.96795: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204115.96886: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204115.96929: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204115.96974: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204115.97009: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204115.97102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204115.97135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204115.97893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204115.97943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204115.97962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204115.98020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204115.98129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204115.98158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204115.98202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204115.98336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204115.98387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204115.98415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204115.98452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204115.98578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204115.98598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204115.98933: variable 'network_connections' from source: task vars 16142 1727204115.99095: variable 'controller_profile' from source: play vars 16142 1727204115.99181: variable 'controller_profile' from source: play vars 16142 1727204115.99201: variable 'controller_device' from source: play vars 16142 1727204115.99367: variable 'controller_device' from source: play vars 16142 1727204115.99383: variable 'port1_profile' from source: play vars 16142 1727204115.99560: variable 'port1_profile' from source: play vars 16142 1727204115.99574: variable 'dhcp_interface1' from source: play vars 16142 1727204115.99748: variable 'dhcp_interface1' from source: play vars 16142 1727204115.99759: variable 'controller_profile' from source: play vars 16142 1727204115.99818: variable 'controller_profile' from source: play vars 16142 1727204115.99831: variable 'port2_profile' from source: play vars 16142 1727204115.99895: variable 'port2_profile' from source: play vars 16142 1727204115.99958: variable 'dhcp_interface2' from source: play vars 16142 1727204116.00115: variable 'dhcp_interface2' from source: play vars 16142 1727204116.00127: variable 'controller_profile' from source: play vars 16142 1727204116.00299: variable 'controller_profile' from source: play vars 16142 1727204116.00470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204116.00741: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204116.00788: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204116.00828: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204116.00861: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204116.00910: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204116.00940: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204116.00971: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204116.01001: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204116.01084: variable '__network_team_connections_defined' from source: role '' defaults 16142 1727204116.02051: variable 'network_connections' from source: task vars 16142 1727204116.02125: variable 'controller_profile' from source: play vars 16142 1727204116.02195: variable 'controller_profile' from source: play vars 16142 1727204116.02343: variable 'controller_device' from source: play vars 16142 1727204116.02408: variable 'controller_device' from source: play vars 16142 1727204116.02424: variable 'port1_profile' from source: play vars 16142 1727204116.02490: variable 'port1_profile' from source: play vars 16142 1727204116.02503: variable 'dhcp_interface1' from source: play vars 16142 1727204116.02570: variable 'dhcp_interface1' from source: play vars 16142 1727204116.02583: variable 'controller_profile' from source: play vars 16142 1727204116.02647: variable 'controller_profile' from source: play vars 16142 1727204116.02665: variable 'port2_profile' from source: play vars 16142 1727204116.02726: variable 'port2_profile' from source: play vars 16142 1727204116.02738: variable 'dhcp_interface2' from source: play vars 16142 1727204116.02804: variable 'dhcp_interface2' from source: play vars 16142 1727204116.02818: variable 'controller_profile' from source: play vars 16142 1727204116.02886: variable 'controller_profile' from source: play vars 16142 1727204116.02928: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 16142 1727204116.02936: when evaluation is False, skipping this task 16142 1727204116.02944: _execute() done 16142 1727204116.02950: dumping result to json 16142 1727204116.02957: done dumping result, returning 16142 1727204116.02971: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-fddd-f6c7-00000000002d] 16142 1727204116.02984: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000002d skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 16142 1727204116.03139: no more pending results, returning what we have 16142 1727204116.03143: results queue empty 16142 1727204116.03145: checking for any_errors_fatal 16142 1727204116.03149: done checking for any_errors_fatal 16142 1727204116.03150: checking for max_fail_percentage 16142 1727204116.03152: done checking for max_fail_percentage 16142 1727204116.03153: checking to see 
if all hosts have failed and the running result is not ok 16142 1727204116.03154: done checking to see if all hosts have failed 16142 1727204116.03155: getting the remaining hosts for this loop 16142 1727204116.03156: done getting the remaining hosts for this loop 16142 1727204116.03160: getting the next task for host managed-node2 16142 1727204116.03171: done getting next task for host managed-node2 16142 1727204116.03176: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 16142 1727204116.03179: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204116.03196: getting variables 16142 1727204116.03198: in VariableManager get_vars() 16142 1727204116.03260: Calling all_inventory to load vars for managed-node2 16142 1727204116.03265: Calling groups_inventory to load vars for managed-node2 16142 1727204116.03268: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204116.03278: Calling all_plugins_play to load vars for managed-node2 16142 1727204116.03281: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204116.03284: Calling groups_plugins_play to load vars for managed-node2 16142 1727204116.04284: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000002d 16142 1727204116.04288: WORKER PROCESS EXITING 16142 1727204116.05155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204116.06776: done with get_vars() 16142 1727204116.06806: done getting variables 16142 1727204116.06869: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:55:16 -0400 (0:00:00.143) 0:00:15.245 ***** 16142 1727204116.06905: entering _queue_task() for managed-node2/package 16142 1727204116.07219: worker is 1 (out of 1 available) 16142 1727204116.07232: exiting _queue_task() for managed-node2/package 16142 1727204116.07245: done queuing things up, now waiting for results queue to drain 16142 1727204116.07246: waiting for pending results... 
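The "Install packages" task queued here (roles/network/tasks/main.yml:73) goes through the generic ansible.builtin.package action. The skip result further on shows its guard, not network_packages is subset(ansible_facts.packages.keys()), i.e. the package manager is only invoked when at least one required package is missing from the gathered package facts. A minimal sketch of that pattern (not the role's verbatim source) looks like:

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"    # role default, resolved from the provider as traced below
    state: present                    # assumption; the exact desired state is not shown in the log
  when: not network_packages is subset(ansible_facts.packages.keys())

Since every entry in network_packages is already present in ansible_facts.packages, the condition comes out False and the install is skipped.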
16142 1727204116.07524: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 16142 1727204116.07667: in run() - task 0affcd87-79f5-fddd-f6c7-00000000002e 16142 1727204116.07692: variable 'ansible_search_path' from source: unknown 16142 1727204116.07700: variable 'ansible_search_path' from source: unknown 16142 1727204116.07742: calling self._execute() 16142 1727204116.07838: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204116.07849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204116.07862: variable 'omit' from source: magic vars 16142 1727204116.08239: variable 'ansible_distribution_major_version' from source: facts 16142 1727204116.08258: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204116.08463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204116.08739: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204116.08796: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204116.08836: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204116.08875: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204116.08995: variable 'network_packages' from source: role '' defaults 16142 1727204116.09108: variable '__network_provider_setup' from source: role '' defaults 16142 1727204116.09125: variable '__network_service_name_default_nm' from source: role '' defaults 16142 1727204116.09191: variable '__network_service_name_default_nm' from source: role '' defaults 16142 1727204116.09206: variable '__network_packages_default_nm' from source: role '' defaults 16142 1727204116.09275: variable '__network_packages_default_nm' from source: role '' defaults 16142 1727204116.09470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204116.11732: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204116.11804: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204116.11850: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204116.11901: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204116.11936: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204116.12018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204116.12056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204116.12090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204116.12136: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204116.12159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204116.12207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204116.12236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204116.12271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204116.12316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204116.12335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204116.12582: variable '__network_packages_default_gobject_packages' from source: role '' defaults 16142 1727204116.12702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204116.12730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204116.12759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204116.12808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204116.12829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204116.12928: variable 'ansible_python' from source: facts 16142 1727204116.12959: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 16142 1727204116.13054: variable '__network_wpa_supplicant_required' from source: role '' defaults 16142 1727204116.13145: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 16142 1727204116.13286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204116.13315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 16142 1727204116.13349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204116.13394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204116.13410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204116.13458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204116.13497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204116.13557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204116.13597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204116.13613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204116.13757: variable 'network_connections' from source: task vars 16142 1727204116.13773: variable 'controller_profile' from source: play vars 16142 1727204116.13867: variable 'controller_profile' from source: play vars 16142 1727204116.13888: variable 'controller_device' from source: play vars 16142 1727204116.13994: variable 'controller_device' from source: play vars 16142 1727204116.14014: variable 'port1_profile' from source: play vars 16142 1727204116.14120: variable 'port1_profile' from source: play vars 16142 1727204116.14135: variable 'dhcp_interface1' from source: play vars 16142 1727204116.14239: variable 'dhcp_interface1' from source: play vars 16142 1727204116.14252: variable 'controller_profile' from source: play vars 16142 1727204116.14465: variable 'controller_profile' from source: play vars 16142 1727204116.14481: variable 'port2_profile' from source: play vars 16142 1727204116.14584: variable 'port2_profile' from source: play vars 16142 1727204116.14758: variable 'dhcp_interface2' from source: play vars 16142 1727204116.14860: variable 'dhcp_interface2' from source: play vars 16142 1727204116.14976: variable 'controller_profile' from source: play vars 16142 1727204116.15190: variable 'controller_profile' from source: play vars 16142 1727204116.15276: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204116.15314: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 
1727204116.15428: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204116.15461: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204116.15556: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204116.16114: variable 'network_connections' from source: task vars 16142 1727204116.16169: variable 'controller_profile' from source: play vars 16142 1727204116.16366: variable 'controller_profile' from source: play vars 16142 1727204116.16392: variable 'controller_device' from source: play vars 16142 1727204116.16587: variable 'controller_device' from source: play vars 16142 1727204116.16720: variable 'port1_profile' from source: play vars 16142 1727204116.16933: variable 'port1_profile' from source: play vars 16142 1727204116.16949: variable 'dhcp_interface1' from source: play vars 16142 1727204116.17171: variable 'dhcp_interface1' from source: play vars 16142 1727204116.17186: variable 'controller_profile' from source: play vars 16142 1727204116.17298: variable 'controller_profile' from source: play vars 16142 1727204116.17372: variable 'port2_profile' from source: play vars 16142 1727204116.17576: variable 'port2_profile' from source: play vars 16142 1727204116.17591: variable 'dhcp_interface2' from source: play vars 16142 1727204116.17900: variable 'dhcp_interface2' from source: play vars 16142 1727204116.17915: variable 'controller_profile' from source: play vars 16142 1727204116.18015: variable 'controller_profile' from source: play vars 16142 1727204116.18158: variable '__network_packages_default_wireless' from source: role '' defaults 16142 1727204116.18350: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204116.19016: variable 'network_connections' from source: task vars 16142 1727204116.19026: variable 'controller_profile' from source: play vars 16142 1727204116.19154: variable 'controller_profile' from source: play vars 16142 1727204116.19179: variable 'controller_device' from source: play vars 16142 1727204116.19251: variable 'controller_device' from source: play vars 16142 1727204116.19267: variable 'port1_profile' from source: play vars 16142 1727204116.19332: variable 'port1_profile' from source: play vars 16142 1727204116.19350: variable 'dhcp_interface1' from source: play vars 16142 1727204116.19430: variable 'dhcp_interface1' from source: play vars 16142 1727204116.19441: variable 'controller_profile' from source: play vars 16142 1727204116.19507: variable 'controller_profile' from source: play vars 16142 1727204116.19523: variable 'port2_profile' from source: play vars 16142 1727204116.19605: variable 'port2_profile' from source: play vars 16142 1727204116.19618: variable 'dhcp_interface2' from source: play vars 16142 1727204116.19691: variable 'dhcp_interface2' from source: play vars 16142 1727204116.19702: variable 'controller_profile' from source: play vars 16142 1727204116.19770: variable 'controller_profile' from source: play vars 16142 1727204116.19800: variable '__network_packages_default_team' from source: role '' defaults 16142 1727204116.19884: variable '__network_team_connections_defined' from source: role '' defaults 16142 1727204116.20229: variable 'network_connections' from source: 
task vars 16142 1727204116.20239: variable 'controller_profile' from source: play vars 16142 1727204116.20309: variable 'controller_profile' from source: play vars 16142 1727204116.20320: variable 'controller_device' from source: play vars 16142 1727204116.20406: variable 'controller_device' from source: play vars 16142 1727204116.20421: variable 'port1_profile' from source: play vars 16142 1727204116.20488: variable 'port1_profile' from source: play vars 16142 1727204116.20505: variable 'dhcp_interface1' from source: play vars 16142 1727204116.20569: variable 'dhcp_interface1' from source: play vars 16142 1727204116.20580: variable 'controller_profile' from source: play vars 16142 1727204116.20672: variable 'controller_profile' from source: play vars 16142 1727204116.20684: variable 'port2_profile' from source: play vars 16142 1727204116.20778: variable 'port2_profile' from source: play vars 16142 1727204116.20790: variable 'dhcp_interface2' from source: play vars 16142 1727204116.20880: variable 'dhcp_interface2' from source: play vars 16142 1727204116.20891: variable 'controller_profile' from source: play vars 16142 1727204116.20960: variable 'controller_profile' from source: play vars 16142 1727204116.21043: variable '__network_service_name_default_initscripts' from source: role '' defaults 16142 1727204116.21111: variable '__network_service_name_default_initscripts' from source: role '' defaults 16142 1727204116.21138: variable '__network_packages_default_initscripts' from source: role '' defaults 16142 1727204116.21206: variable '__network_packages_default_initscripts' from source: role '' defaults 16142 1727204116.21818: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 16142 1727204116.22651: variable 'network_connections' from source: task vars 16142 1727204116.22661: variable 'controller_profile' from source: play vars 16142 1727204116.22728: variable 'controller_profile' from source: play vars 16142 1727204116.22749: variable 'controller_device' from source: play vars 16142 1727204116.22814: variable 'controller_device' from source: play vars 16142 1727204116.22827: variable 'port1_profile' from source: play vars 16142 1727204116.22905: variable 'port1_profile' from source: play vars 16142 1727204116.22969: variable 'dhcp_interface1' from source: play vars 16142 1727204116.23029: variable 'dhcp_interface1' from source: play vars 16142 1727204116.23144: variable 'controller_profile' from source: play vars 16142 1727204116.23215: variable 'controller_profile' from source: play vars 16142 1727204116.23228: variable 'port2_profile' from source: play vars 16142 1727204116.23294: variable 'port2_profile' from source: play vars 16142 1727204116.23305: variable 'dhcp_interface2' from source: play vars 16142 1727204116.23365: variable 'dhcp_interface2' from source: play vars 16142 1727204116.23377: variable 'controller_profile' from source: play vars 16142 1727204116.23439: variable 'controller_profile' from source: play vars 16142 1727204116.23452: variable 'ansible_distribution' from source: facts 16142 1727204116.23459: variable '__network_rh_distros' from source: role '' defaults 16142 1727204116.23471: variable 'ansible_distribution_major_version' from source: facts 16142 1727204116.23504: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 16142 1727204116.23675: variable 'ansible_distribution' from source: facts 16142 1727204116.23684: variable '__network_rh_distros' from source: role '' defaults 
16142 1727204116.23693: variable 'ansible_distribution_major_version' from source: facts 16142 1727204116.23715: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 16142 1727204116.23880: variable 'ansible_distribution' from source: facts 16142 1727204116.23889: variable '__network_rh_distros' from source: role '' defaults 16142 1727204116.23899: variable 'ansible_distribution_major_version' from source: facts 16142 1727204116.23945: variable 'network_provider' from source: set_fact 16142 1727204116.23970: variable 'ansible_facts' from source: unknown 16142 1727204116.24982: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 16142 1727204116.24993: when evaluation is False, skipping this task 16142 1727204116.25000: _execute() done 16142 1727204116.25007: dumping result to json 16142 1727204116.25020: done dumping result, returning 16142 1727204116.25036: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-fddd-f6c7-00000000002e] 16142 1727204116.25048: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000002e skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 16142 1727204116.25242: no more pending results, returning what we have 16142 1727204116.25247: results queue empty 16142 1727204116.25249: checking for any_errors_fatal 16142 1727204116.25256: done checking for any_errors_fatal 16142 1727204116.25256: checking for max_fail_percentage 16142 1727204116.25259: done checking for max_fail_percentage 16142 1727204116.25260: checking to see if all hosts have failed and the running result is not ok 16142 1727204116.25260: done checking to see if all hosts have failed 16142 1727204116.25261: getting the remaining hosts for this loop 16142 1727204116.25262: done getting the remaining hosts for this loop 16142 1727204116.25268: getting the next task for host managed-node2 16142 1727204116.25277: done getting next task for host managed-node2 16142 1727204116.25282: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 16142 1727204116.25285: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204116.25300: getting variables 16142 1727204116.25302: in VariableManager get_vars() 16142 1727204116.25361: Calling all_inventory to load vars for managed-node2 16142 1727204116.25367: Calling groups_inventory to load vars for managed-node2 16142 1727204116.25370: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204116.25380: Calling all_plugins_play to load vars for managed-node2 16142 1727204116.25383: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204116.25387: Calling groups_plugins_play to load vars for managed-node2 16142 1727204116.26390: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000002e 16142 1727204116.26394: WORKER PROCESS EXITING 16142 1727204116.28851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204116.30945: done with get_vars() 16142 1727204116.30992: done getting variables 16142 1727204116.31069: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:55:16 -0400 (0:00:00.242) 0:00:15.487 ***** 16142 1727204116.31115: entering _queue_task() for managed-node2/package 16142 1727204116.31810: worker is 1 (out of 1 available) 16142 1727204116.31824: exiting _queue_task() for managed-node2/package 16142 1727204116.31838: done queuing things up, now waiting for results queue to drain 16142 1727204116.31839: waiting for pending results... 
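Task "Install NetworkManager and nmstate when using network_state variable" (roles/network/tasks/main.yml:85) again uses ansible.builtin.package and, per the skip result that follows, is guarded by network_state != {}. A sketch of that shape, with package names inferred from the task title rather than the role source:

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager    # inferred from the task title
      - nmstate           # inferred from the task title
    state: present        # assumption
  when: network_state != {}           # the false_condition reported in the log

network_state comes from the role defaults in this run and is still an empty dict, so the task is skipped.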
16142 1727204116.32451: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 16142 1727204116.32608: in run() - task 0affcd87-79f5-fddd-f6c7-00000000002f 16142 1727204116.32623: variable 'ansible_search_path' from source: unknown 16142 1727204116.32626: variable 'ansible_search_path' from source: unknown 16142 1727204116.32667: calling self._execute() 16142 1727204116.32816: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204116.32819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204116.32880: variable 'omit' from source: magic vars 16142 1727204116.33713: variable 'ansible_distribution_major_version' from source: facts 16142 1727204116.33738: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204116.33908: variable 'network_state' from source: role '' defaults 16142 1727204116.33927: Evaluated conditional (network_state != {}): False 16142 1727204116.33939: when evaluation is False, skipping this task 16142 1727204116.33947: _execute() done 16142 1727204116.33954: dumping result to json 16142 1727204116.33961: done dumping result, returning 16142 1727204116.33977: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-fddd-f6c7-00000000002f] 16142 1727204116.33990: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000002f skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204116.34174: no more pending results, returning what we have 16142 1727204116.34178: results queue empty 16142 1727204116.34178: checking for any_errors_fatal 16142 1727204116.34183: done checking for any_errors_fatal 16142 1727204116.34184: checking for max_fail_percentage 16142 1727204116.34186: done checking for max_fail_percentage 16142 1727204116.34187: checking to see if all hosts have failed and the running result is not ok 16142 1727204116.34187: done checking to see if all hosts have failed 16142 1727204116.34188: getting the remaining hosts for this loop 16142 1727204116.34189: done getting the remaining hosts for this loop 16142 1727204116.34194: getting the next task for host managed-node2 16142 1727204116.34200: done getting next task for host managed-node2 16142 1727204116.34204: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 16142 1727204116.34207: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204116.34228: getting variables 16142 1727204116.34230: in VariableManager get_vars() 16142 1727204116.34287: Calling all_inventory to load vars for managed-node2 16142 1727204116.34290: Calling groups_inventory to load vars for managed-node2 16142 1727204116.34293: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204116.34305: Calling all_plugins_play to load vars for managed-node2 16142 1727204116.34307: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204116.34310: Calling groups_plugins_play to load vars for managed-node2 16142 1727204116.34834: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000002f 16142 1727204116.34838: WORKER PROCESS EXITING 16142 1727204116.45241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204116.46805: done with get_vars() 16142 1727204116.46832: done getting variables 16142 1727204116.46873: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:55:16 -0400 (0:00:00.157) 0:00:15.645 ***** 16142 1727204116.46897: entering _queue_task() for managed-node2/package 16142 1727204116.47146: worker is 1 (out of 1 available) 16142 1727204116.47176: exiting _queue_task() for managed-node2/package 16142 1727204116.47188: done queuing things up, now waiting for results queue to drain 16142 1727204116.47190: waiting for pending results... 
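The companion task "Install python3-libnmstate when using network_state variable" (roles/network/tasks/main.yml:96) follows the same pattern with a single package; sketched under the same assumptions:

- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate
    state: present                    # assumption, as above
  when: network_state != {}

The skip result that follows shows it is bypassed for the same reason: network_state is still the empty role default.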
16142 1727204116.47478: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 16142 1727204116.47608: in run() - task 0affcd87-79f5-fddd-f6c7-000000000030 16142 1727204116.47756: variable 'ansible_search_path' from source: unknown 16142 1727204116.47760: variable 'ansible_search_path' from source: unknown 16142 1727204116.47763: calling self._execute() 16142 1727204116.47877: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204116.47907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204116.47911: variable 'omit' from source: magic vars 16142 1727204116.48511: variable 'ansible_distribution_major_version' from source: facts 16142 1727204116.48538: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204116.48709: variable 'network_state' from source: role '' defaults 16142 1727204116.48729: Evaluated conditional (network_state != {}): False 16142 1727204116.48732: when evaluation is False, skipping this task 16142 1727204116.48749: _execute() done 16142 1727204116.48753: dumping result to json 16142 1727204116.48756: done dumping result, returning 16142 1727204116.48768: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-fddd-f6c7-000000000030] 16142 1727204116.48807: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000030 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204116.49056: no more pending results, returning what we have 16142 1727204116.49061: results queue empty 16142 1727204116.49066: checking for any_errors_fatal 16142 1727204116.49088: done checking for any_errors_fatal 16142 1727204116.49089: checking for max_fail_percentage 16142 1727204116.49091: done checking for max_fail_percentage 16142 1727204116.49093: checking to see if all hosts have failed and the running result is not ok 16142 1727204116.49093: done checking to see if all hosts have failed 16142 1727204116.49094: getting the remaining hosts for this loop 16142 1727204116.49096: done getting the remaining hosts for this loop 16142 1727204116.49100: getting the next task for host managed-node2 16142 1727204116.49107: done getting next task for host managed-node2 16142 1727204116.49117: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 16142 1727204116.49122: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204116.49146: getting variables 16142 1727204116.49149: in VariableManager get_vars() 16142 1727204116.49227: Calling all_inventory to load vars for managed-node2 16142 1727204116.49231: Calling groups_inventory to load vars for managed-node2 16142 1727204116.49237: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204116.49253: Calling all_plugins_play to load vars for managed-node2 16142 1727204116.49261: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204116.49326: Calling groups_plugins_play to load vars for managed-node2 16142 1727204116.49850: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000030 16142 1727204116.49854: WORKER PROCESS EXITING 16142 1727204116.51101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204116.53017: done with get_vars() 16142 1727204116.53042: done getting variables 16142 1727204116.53122: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:55:16 -0400 (0:00:00.062) 0:00:15.708 ***** 16142 1727204116.53151: entering _queue_task() for managed-node2/service 16142 1727204116.53153: Creating lock for service 16142 1727204116.53400: worker is 1 (out of 1 available) 16142 1727204116.53415: exiting _queue_task() for managed-node2/service 16142 1727204116.53427: done queuing things up, now waiting for results queue to drain 16142 1727204116.53429: waiting for pending results... 
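"Restart NetworkManager due to wireless or team interfaces" (roles/network/tasks/main.yml:109) is the first task in this stretch to load the service action plugin fresh (found_in_cache=False). A sketch using the condition its skip result reports further down, with the service name assumed from the task title rather than the role source:

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager              # assumed from the task title
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined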
16142 1727204116.53608: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 16142 1727204116.53698: in run() - task 0affcd87-79f5-fddd-f6c7-000000000031 16142 1727204116.53709: variable 'ansible_search_path' from source: unknown 16142 1727204116.53713: variable 'ansible_search_path' from source: unknown 16142 1727204116.53748: calling self._execute() 16142 1727204116.53823: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204116.53826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204116.53839: variable 'omit' from source: magic vars 16142 1727204116.54192: variable 'ansible_distribution_major_version' from source: facts 16142 1727204116.54220: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204116.54355: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204116.54550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204116.57095: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204116.57168: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204116.57204: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204116.57238: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204116.57265: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204116.57340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204116.57368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204116.57397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204116.57438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204116.57452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204116.57500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204116.57521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204116.57545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 16142 1727204116.57585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204116.57597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204116.57638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204116.57659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204116.57685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204116.57721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204116.57736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204116.57898: variable 'network_connections' from source: task vars 16142 1727204116.57910: variable 'controller_profile' from source: play vars 16142 1727204116.57981: variable 'controller_profile' from source: play vars 16142 1727204116.57990: variable 'controller_device' from source: play vars 16142 1727204116.58050: variable 'controller_device' from source: play vars 16142 1727204116.58059: variable 'port1_profile' from source: play vars 16142 1727204116.58116: variable 'port1_profile' from source: play vars 16142 1727204116.58124: variable 'dhcp_interface1' from source: play vars 16142 1727204116.58185: variable 'dhcp_interface1' from source: play vars 16142 1727204116.58192: variable 'controller_profile' from source: play vars 16142 1727204116.58248: variable 'controller_profile' from source: play vars 16142 1727204116.58254: variable 'port2_profile' from source: play vars 16142 1727204116.58313: variable 'port2_profile' from source: play vars 16142 1727204116.58316: variable 'dhcp_interface2' from source: play vars 16142 1727204116.58375: variable 'dhcp_interface2' from source: play vars 16142 1727204116.58378: variable 'controller_profile' from source: play vars 16142 1727204116.58438: variable 'controller_profile' from source: play vars 16142 1727204116.58504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204116.58683: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204116.58720: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204116.58747: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204116.58778: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204116.58819: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204116.58841: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204116.58870: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204116.58900: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204116.58960: variable '__network_team_connections_defined' from source: role '' defaults 16142 1727204116.59472: variable 'network_connections' from source: task vars 16142 1727204116.59476: variable 'controller_profile' from source: play vars 16142 1727204116.59478: variable 'controller_profile' from source: play vars 16142 1727204116.59481: variable 'controller_device' from source: play vars 16142 1727204116.59483: variable 'controller_device' from source: play vars 16142 1727204116.59485: variable 'port1_profile' from source: play vars 16142 1727204116.59487: variable 'port1_profile' from source: play vars 16142 1727204116.59489: variable 'dhcp_interface1' from source: play vars 16142 1727204116.59490: variable 'dhcp_interface1' from source: play vars 16142 1727204116.59492: variable 'controller_profile' from source: play vars 16142 1727204116.59511: variable 'controller_profile' from source: play vars 16142 1727204116.59518: variable 'port2_profile' from source: play vars 16142 1727204116.59578: variable 'port2_profile' from source: play vars 16142 1727204116.59586: variable 'dhcp_interface2' from source: play vars 16142 1727204116.59642: variable 'dhcp_interface2' from source: play vars 16142 1727204116.59648: variable 'controller_profile' from source: play vars 16142 1727204116.59706: variable 'controller_profile' from source: play vars 16142 1727204116.59742: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 16142 1727204116.59745: when evaluation is False, skipping this task 16142 1727204116.59748: _execute() done 16142 1727204116.59750: dumping result to json 16142 1727204116.59752: done dumping result, returning 16142 1727204116.59762: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-fddd-f6c7-000000000031] 16142 1727204116.59770: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000031 16142 1727204116.59865: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000031 16142 1727204116.59868: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 16142 1727204116.59911: no more pending results, returning what we have 16142 1727204116.59915: results queue empty 16142 1727204116.59916: checking for any_errors_fatal 16142 1727204116.59923: done checking for any_errors_fatal 16142 1727204116.59924: checking for max_fail_percentage 16142 1727204116.59926: done checking for max_fail_percentage 16142 
1727204116.59926: checking to see if all hosts have failed and the running result is not ok 16142 1727204116.59927: done checking to see if all hosts have failed 16142 1727204116.59928: getting the remaining hosts for this loop 16142 1727204116.59929: done getting the remaining hosts for this loop 16142 1727204116.59935: getting the next task for host managed-node2 16142 1727204116.59941: done getting next task for host managed-node2 16142 1727204116.59945: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 16142 1727204116.59948: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204116.59961: getting variables 16142 1727204116.59963: in VariableManager get_vars() 16142 1727204116.60019: Calling all_inventory to load vars for managed-node2 16142 1727204116.60022: Calling groups_inventory to load vars for managed-node2 16142 1727204116.60024: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204116.60035: Calling all_plugins_play to load vars for managed-node2 16142 1727204116.60038: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204116.60040: Calling groups_plugins_play to load vars for managed-node2 16142 1727204116.61606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204116.63367: done with get_vars() 16142 1727204116.63408: done getting variables 16142 1727204116.63481: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:55:16 -0400 (0:00:00.103) 0:00:15.812 ***** 16142 1727204116.63522: entering _queue_task() for managed-node2/service 16142 1727204116.63881: worker is 1 (out of 1 available) 16142 1727204116.63897: exiting _queue_task() for managed-node2/service 16142 1727204116.63909: done queuing things up, now waiting for results queue to drain 16142 1727204116.63911: waiting for pending results... 
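The two role tasks traced here are gated purely by their conditionals: "Restart NetworkManager due to wireless or team interfaces" is skipped because (__network_wireless_connections_defined or __network_team_connections_defined) evaluates to False for this connection set, while "Enable and start NetworkManager" at tasks/main.yml:122 proceeds because (network_provider == "nm" or network_state != {}) is True. A minimal sketch of what such tasks could look like in the role is shown below; only the task names, the two conditionals, the network_service_name variable, and the use of the 'service' action are taken from the log above, and the module arguments and defaults are assumptions for illustration, not the role's verbatim source.

# Hypothetical sketch (assumed module arguments; names and conditionals taken from this log)
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:        # action plugin 'service' is loaded in the log; exact module assumed
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"   # variable seen in the log; its default value is assumed here
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}

On the managed node this service task is carried out by the systemd module payload (AnsiballZ_systemd.py) that the log shows being built and transferred next, and its JSON result reports changed: false because NetworkManager.service is already enabled and running.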
16142 1727204116.64223: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 16142 1727204116.64402: in run() - task 0affcd87-79f5-fddd-f6c7-000000000032 16142 1727204116.64424: variable 'ansible_search_path' from source: unknown 16142 1727204116.64435: variable 'ansible_search_path' from source: unknown 16142 1727204116.64487: calling self._execute() 16142 1727204116.64601: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204116.64617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204116.64635: variable 'omit' from source: magic vars 16142 1727204116.65059: variable 'ansible_distribution_major_version' from source: facts 16142 1727204116.65080: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204116.65271: variable 'network_provider' from source: set_fact 16142 1727204116.65284: variable 'network_state' from source: role '' defaults 16142 1727204116.65299: Evaluated conditional (network_provider == "nm" or network_state != {}): True 16142 1727204116.65310: variable 'omit' from source: magic vars 16142 1727204116.65377: variable 'omit' from source: magic vars 16142 1727204116.65412: variable 'network_service_name' from source: role '' defaults 16142 1727204116.65496: variable 'network_service_name' from source: role '' defaults 16142 1727204116.65617: variable '__network_provider_setup' from source: role '' defaults 16142 1727204116.65628: variable '__network_service_name_default_nm' from source: role '' defaults 16142 1727204116.65709: variable '__network_service_name_default_nm' from source: role '' defaults 16142 1727204116.65723: variable '__network_packages_default_nm' from source: role '' defaults 16142 1727204116.65797: variable '__network_packages_default_nm' from source: role '' defaults 16142 1727204116.66042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204116.68473: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204116.68556: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204116.68609: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204116.68657: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204116.68690: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204116.68782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204116.68821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204116.68859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204116.68905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 16142 1727204116.68929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204116.68987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204116.69016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204116.69053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204116.69103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204116.69123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204116.69405: variable '__network_packages_default_gobject_packages' from source: role '' defaults 16142 1727204116.69543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204116.69580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204116.69615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204116.69663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204116.69685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204116.69794: variable 'ansible_python' from source: facts 16142 1727204116.69826: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 16142 1727204116.69921: variable '__network_wpa_supplicant_required' from source: role '' defaults 16142 1727204116.70019: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 16142 1727204116.70168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204116.70198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204116.70235: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204116.70285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204116.70305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204116.70361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204116.70397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204116.70422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204116.70468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204116.70490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204116.70655: variable 'network_connections' from source: task vars 16142 1727204116.70670: variable 'controller_profile' from source: play vars 16142 1727204116.70760: variable 'controller_profile' from source: play vars 16142 1727204116.70780: variable 'controller_device' from source: play vars 16142 1727204116.70864: variable 'controller_device' from source: play vars 16142 1727204116.70888: variable 'port1_profile' from source: play vars 16142 1727204116.70970: variable 'port1_profile' from source: play vars 16142 1727204116.70991: variable 'dhcp_interface1' from source: play vars 16142 1727204116.71075: variable 'dhcp_interface1' from source: play vars 16142 1727204116.71095: variable 'controller_profile' from source: play vars 16142 1727204116.71177: variable 'controller_profile' from source: play vars 16142 1727204116.71195: variable 'port2_profile' from source: play vars 16142 1727204116.71282: variable 'port2_profile' from source: play vars 16142 1727204116.71299: variable 'dhcp_interface2' from source: play vars 16142 1727204116.71386: variable 'dhcp_interface2' from source: play vars 16142 1727204116.71402: variable 'controller_profile' from source: play vars 16142 1727204116.71490: variable 'controller_profile' from source: play vars 16142 1727204116.71619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204116.71867: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204116.71928: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204116.71987: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 
1727204116.72041: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204116.72120: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204116.72158: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204116.72201: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204116.72248: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204116.72308: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204116.72626: variable 'network_connections' from source: task vars 16142 1727204116.72639: variable 'controller_profile' from source: play vars 16142 1727204116.72711: variable 'controller_profile' from source: play vars 16142 1727204116.72731: variable 'controller_device' from source: play vars 16142 1727204116.72811: variable 'controller_device' from source: play vars 16142 1727204116.72837: variable 'port1_profile' from source: play vars 16142 1727204116.72913: variable 'port1_profile' from source: play vars 16142 1727204116.72929: variable 'dhcp_interface1' from source: play vars 16142 1727204116.73006: variable 'dhcp_interface1' from source: play vars 16142 1727204116.73020: variable 'controller_profile' from source: play vars 16142 1727204116.73111: variable 'controller_profile' from source: play vars 16142 1727204116.73128: variable 'port2_profile' from source: play vars 16142 1727204116.73217: variable 'port2_profile' from source: play vars 16142 1727204116.73237: variable 'dhcp_interface2' from source: play vars 16142 1727204116.73324: variable 'dhcp_interface2' from source: play vars 16142 1727204116.73344: variable 'controller_profile' from source: play vars 16142 1727204116.73428: variable 'controller_profile' from source: play vars 16142 1727204116.73492: variable '__network_packages_default_wireless' from source: role '' defaults 16142 1727204116.73588: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204116.73930: variable 'network_connections' from source: task vars 16142 1727204116.73943: variable 'controller_profile' from source: play vars 16142 1727204116.74024: variable 'controller_profile' from source: play vars 16142 1727204116.74043: variable 'controller_device' from source: play vars 16142 1727204116.74118: variable 'controller_device' from source: play vars 16142 1727204116.74136: variable 'port1_profile' from source: play vars 16142 1727204116.74213: variable 'port1_profile' from source: play vars 16142 1727204116.74226: variable 'dhcp_interface1' from source: play vars 16142 1727204116.74310: variable 'dhcp_interface1' from source: play vars 16142 1727204116.74322: variable 'controller_profile' from source: play vars 16142 1727204116.74405: variable 'controller_profile' from source: play vars 16142 1727204116.74418: variable 'port2_profile' from source: play vars 16142 1727204116.74502: variable 'port2_profile' from source: play vars 16142 
1727204116.74518: variable 'dhcp_interface2' from source: play vars 16142 1727204116.74602: variable 'dhcp_interface2' from source: play vars 16142 1727204116.74620: variable 'controller_profile' from source: play vars 16142 1727204116.74699: variable 'controller_profile' from source: play vars 16142 1727204116.74736: variable '__network_packages_default_team' from source: role '' defaults 16142 1727204116.74826: variable '__network_team_connections_defined' from source: role '' defaults 16142 1727204116.75359: variable 'network_connections' from source: task vars 16142 1727204116.75372: variable 'controller_profile' from source: play vars 16142 1727204116.75445: variable 'controller_profile' from source: play vars 16142 1727204116.75458: variable 'controller_device' from source: play vars 16142 1727204116.75541: variable 'controller_device' from source: play vars 16142 1727204116.75560: variable 'port1_profile' from source: play vars 16142 1727204116.75642: variable 'port1_profile' from source: play vars 16142 1727204116.75655: variable 'dhcp_interface1' from source: play vars 16142 1727204116.75749: variable 'dhcp_interface1' from source: play vars 16142 1727204116.75761: variable 'controller_profile' from source: play vars 16142 1727204116.75846: variable 'controller_profile' from source: play vars 16142 1727204116.75859: variable 'port2_profile' from source: play vars 16142 1727204116.75942: variable 'port2_profile' from source: play vars 16142 1727204116.75954: variable 'dhcp_interface2' from source: play vars 16142 1727204116.76039: variable 'dhcp_interface2' from source: play vars 16142 1727204116.76050: variable 'controller_profile' from source: play vars 16142 1727204116.76128: variable 'controller_profile' from source: play vars 16142 1727204116.76210: variable '__network_service_name_default_initscripts' from source: role '' defaults 16142 1727204116.76289: variable '__network_service_name_default_initscripts' from source: role '' defaults 16142 1727204116.76301: variable '__network_packages_default_initscripts' from source: role '' defaults 16142 1727204116.76375: variable '__network_packages_default_initscripts' from source: role '' defaults 16142 1727204116.76615: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 16142 1727204116.77175: variable 'network_connections' from source: task vars 16142 1727204116.77185: variable 'controller_profile' from source: play vars 16142 1727204116.77258: variable 'controller_profile' from source: play vars 16142 1727204116.77273: variable 'controller_device' from source: play vars 16142 1727204116.77348: variable 'controller_device' from source: play vars 16142 1727204116.77363: variable 'port1_profile' from source: play vars 16142 1727204116.77425: variable 'port1_profile' from source: play vars 16142 1727204116.77439: variable 'dhcp_interface1' from source: play vars 16142 1727204116.77497: variable 'dhcp_interface1' from source: play vars 16142 1727204116.77507: variable 'controller_profile' from source: play vars 16142 1727204116.77575: variable 'controller_profile' from source: play vars 16142 1727204116.77585: variable 'port2_profile' from source: play vars 16142 1727204116.77653: variable 'port2_profile' from source: play vars 16142 1727204116.77670: variable 'dhcp_interface2' from source: play vars 16142 1727204116.77731: variable 'dhcp_interface2' from source: play vars 16142 1727204116.77749: variable 'controller_profile' from source: play vars 16142 1727204116.77809: variable 
'controller_profile' from source: play vars 16142 1727204116.77821: variable 'ansible_distribution' from source: facts 16142 1727204116.77828: variable '__network_rh_distros' from source: role '' defaults 16142 1727204116.77839: variable 'ansible_distribution_major_version' from source: facts 16142 1727204116.77883: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 16142 1727204116.78085: variable 'ansible_distribution' from source: facts 16142 1727204116.78099: variable '__network_rh_distros' from source: role '' defaults 16142 1727204116.78109: variable 'ansible_distribution_major_version' from source: facts 16142 1727204116.78127: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 16142 1727204116.78324: variable 'ansible_distribution' from source: facts 16142 1727204116.78337: variable '__network_rh_distros' from source: role '' defaults 16142 1727204116.78348: variable 'ansible_distribution_major_version' from source: facts 16142 1727204116.78394: variable 'network_provider' from source: set_fact 16142 1727204116.78429: variable 'omit' from source: magic vars 16142 1727204116.78467: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204116.78502: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204116.78539: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204116.78565: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204116.78582: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204116.78620: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204116.78630: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204116.78647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204116.78771: Set connection var ansible_timeout to 10 16142 1727204116.78779: Set connection var ansible_connection to ssh 16142 1727204116.78790: Set connection var ansible_shell_type to sh 16142 1727204116.78799: Set connection var ansible_shell_executable to /bin/sh 16142 1727204116.78810: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204116.78823: Set connection var ansible_pipelining to False 16142 1727204116.78862: variable 'ansible_shell_executable' from source: unknown 16142 1727204116.78873: variable 'ansible_connection' from source: unknown 16142 1727204116.78880: variable 'ansible_module_compression' from source: unknown 16142 1727204116.78886: variable 'ansible_shell_type' from source: unknown 16142 1727204116.78893: variable 'ansible_shell_executable' from source: unknown 16142 1727204116.78900: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204116.78907: variable 'ansible_pipelining' from source: unknown 16142 1727204116.78914: variable 'ansible_timeout' from source: unknown 16142 1727204116.78922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204116.79046: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204116.79067: variable 'omit' from source: magic vars 16142 1727204116.79082: starting attempt loop 16142 1727204116.79089: running the handler 16142 1727204116.79177: variable 'ansible_facts' from source: unknown 16142 1727204116.80014: _low_level_execute_command(): starting 16142 1727204116.80030: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204116.80863: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204116.80883: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204116.80901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204116.80929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204116.80978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204116.80991: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204116.81007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204116.81036: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204116.81050: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204116.81061: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204116.81077: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204116.81092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204116.81110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204116.81126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204116.81146: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204116.81161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204116.81245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204116.81276: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204116.81294: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204116.81387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204116.83043: stdout chunk (state=3): >>>/root <<< 16142 1727204116.83242: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204116.83246: stdout chunk (state=3): >>><<< 16142 1727204116.83252: stderr chunk (state=3): >>><<< 16142 1727204116.83357: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204116.83361: _low_level_execute_command(): starting 16142 1727204116.83366: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204116.8327188-17574-168527258179866 `" && echo ansible-tmp-1727204116.8327188-17574-168527258179866="` echo /root/.ansible/tmp/ansible-tmp-1727204116.8327188-17574-168527258179866 `" ) && sleep 0' 16142 1727204116.83978: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204116.83982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204116.84031: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204116.84037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204116.84039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204116.84042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204116.84115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204116.84118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204116.84190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204116.86073: stdout chunk (state=3): >>>ansible-tmp-1727204116.8327188-17574-168527258179866=/root/.ansible/tmp/ansible-tmp-1727204116.8327188-17574-168527258179866 <<< 16142 1727204116.86291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204116.86295: stdout chunk (state=3): >>><<< 16142 1727204116.86297: stderr chunk (state=3): >>><<< 16142 1727204116.86574: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204116.8327188-17574-168527258179866=/root/.ansible/tmp/ansible-tmp-1727204116.8327188-17574-168527258179866 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204116.86578: variable 'ansible_module_compression' from source: unknown 16142 1727204116.86582: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 16142 1727204116.86585: ANSIBALLZ: Acquiring lock 16142 1727204116.86587: ANSIBALLZ: Lock acquired: 140089297016096 16142 1727204116.86589: ANSIBALLZ: Creating module 16142 1727204117.17642: ANSIBALLZ: Writing module into payload 16142 1727204117.18072: ANSIBALLZ: Writing module 16142 1727204117.18077: ANSIBALLZ: Renaming module 16142 1727204117.18080: ANSIBALLZ: Done creating module 16142 1727204117.18082: variable 'ansible_facts' from source: unknown 16142 1727204117.18117: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204116.8327188-17574-168527258179866/AnsiballZ_systemd.py 16142 1727204117.18278: Sending initial data 16142 1727204117.18281: Sent initial data (156 bytes) 16142 1727204117.19213: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204117.19225: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204117.19238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204117.19251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204117.19297: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204117.19304: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204117.19314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204117.19328: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204117.19337: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204117.19341: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204117.19349: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204117.19359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204117.19375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204117.19382: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204117.19389: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204117.19399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 
1727204117.19472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204117.19488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204117.19491: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204117.19581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204117.21413: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204117.21451: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204117.21490: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmphzlu1_9t /root/.ansible/tmp/ansible-tmp-1727204116.8327188-17574-168527258179866/AnsiballZ_systemd.py <<< 16142 1727204117.21522: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204117.24349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204117.24453: stderr chunk (state=3): >>><<< 16142 1727204117.24456: stdout chunk (state=3): >>><<< 16142 1727204117.24484: done transferring module to remote 16142 1727204117.24495: _low_level_execute_command(): starting 16142 1727204117.24500: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204116.8327188-17574-168527258179866/ /root/.ansible/tmp/ansible-tmp-1727204116.8327188-17574-168527258179866/AnsiballZ_systemd.py && sleep 0' 16142 1727204117.25161: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204117.25173: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204117.25184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204117.25199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204117.25240: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204117.25247: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204117.25258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204117.25274: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204117.25282: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204117.25289: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204117.25297: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204117.25306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204117.25316: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204117.25324: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204117.25335: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204117.25338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204117.25410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204117.25424: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204117.25440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204117.25513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204117.27552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204117.27638: stderr chunk (state=3): >>><<< 16142 1727204117.27642: stdout chunk (state=3): >>><<< 16142 1727204117.27666: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204117.27672: _low_level_execute_command(): starting 16142 1727204117.27678: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204116.8327188-17574-168527258179866/AnsiballZ_systemd.py && sleep 0' 16142 1727204117.28301: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204117.28311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204117.28322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204117.28339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204117.28380: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204117.28387: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204117.28398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204117.28412: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204117.28420: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204117.28427: stderr chunk (state=3): >>>debug1: re-parsing 
configuration <<< 16142 1727204117.28437: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204117.28444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204117.28456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204117.28463: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204117.28472: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204117.28482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204117.28553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204117.28579: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204117.28583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204117.28667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204117.53710: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 16142 1727204117.53778: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "6717440", "MemoryAvailable": "infinity", "CPUUsageNSec": "775520000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": 
"18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", 
"MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdog<<< 16142 1727204117.53788: stdout chunk (state=3): >>>Signal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 16142 1727204117.55295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204117.55299: stdout chunk (state=3): >>><<< 16142 1727204117.55302: stderr chunk (state=3): >>><<< 16142 1727204117.55373: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "6717440", "MemoryAvailable": "infinity", "CPUUsageNSec": "775520000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": 
"infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", 
"ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
16142 1727204117.55534: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204116.8327188-17574-168527258179866/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204117.55570: _low_level_execute_command(): starting 16142 1727204117.55581: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204116.8327188-17574-168527258179866/ > /dev/null 2>&1 && sleep 0' 16142 1727204117.56319: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204117.56339: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204117.56353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204117.56370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204117.56413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204117.56427: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204117.56447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204117.56463: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204117.56475: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204117.56483: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204117.56493: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204117.56503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204117.56515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204117.56524: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204117.56539: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204117.56556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204117.56635: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204117.56668: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204117.56684: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204117.56754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204117.58571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204117.58674: stderr chunk (state=3): >>><<< 16142 1727204117.58687: stdout chunk (state=3): >>><<< 16142 1727204117.58881: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204117.58885: handler run complete 16142 1727204117.58887: attempt loop complete, returning result 16142 1727204117.58889: _execute() done 16142 1727204117.58891: dumping result to json 16142 1727204117.58893: done dumping result, returning 16142 1727204117.58895: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-fddd-f6c7-000000000032] 16142 1727204117.58897: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000032 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204117.59238: no more pending results, returning what we have 16142 1727204117.59241: results queue empty 16142 1727204117.59242: checking for any_errors_fatal 16142 1727204117.59247: done checking for any_errors_fatal 16142 1727204117.59248: checking for max_fail_percentage 16142 1727204117.59249: done checking for max_fail_percentage 16142 1727204117.59250: checking to see if all hosts have failed and the running result is not ok 16142 1727204117.59251: done checking to see if all hosts have failed 16142 1727204117.59251: getting the remaining hosts for this loop 16142 1727204117.59253: done getting the remaining hosts for this loop 16142 1727204117.59256: getting the next task for host managed-node2 16142 1727204117.59263: done getting next task for host managed-node2 16142 1727204117.59268: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 16142 1727204117.59271: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204117.59281: getting variables 16142 1727204117.59282: in VariableManager get_vars() 16142 1727204117.59331: Calling all_inventory to load vars for managed-node2 16142 1727204117.59333: Calling groups_inventory to load vars for managed-node2 16142 1727204117.59335: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204117.59345: Calling all_plugins_play to load vars for managed-node2 16142 1727204117.59347: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204117.59349: Calling groups_plugins_play to load vars for managed-node2 16142 1727204117.60073: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000032 16142 1727204117.60077: WORKER PROCESS EXITING 16142 1727204117.60938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204117.62720: done with get_vars() 16142 1727204117.62753: done getting variables 16142 1727204117.62824: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:55:17 -0400 (0:00:00.993) 0:00:16.805 ***** 16142 1727204117.62874: entering _queue_task() for managed-node2/service 16142 1727204117.63493: worker is 1 (out of 1 available) 16142 1727204117.63506: exiting _queue_task() for managed-node2/service 16142 1727204117.63517: done queuing things up, now waiting for results queue to drain 16142 1727204117.63518: waiting for pending results... 
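The "censored" placeholder in the ok: result above (and again on the skipped "Enable network service" task further down) is what verbose output prints when a task sets no_log. A minimal, hypothetical illustration of the pattern, not the role's actual task:

    - name: Hypothetical task whose result must stay out of the logs
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
      no_log: true   # verbose output then shows only the "output has been hidden ..." placeholder instead of the module result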
16142 1727204117.63814: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 16142 1727204117.63940: in run() - task 0affcd87-79f5-fddd-f6c7-000000000033 16142 1727204117.63950: variable 'ansible_search_path' from source: unknown 16142 1727204117.63954: variable 'ansible_search_path' from source: unknown 16142 1727204117.64002: calling self._execute() 16142 1727204117.64105: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204117.64111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204117.64122: variable 'omit' from source: magic vars 16142 1727204117.64546: variable 'ansible_distribution_major_version' from source: facts 16142 1727204117.64560: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204117.64691: variable 'network_provider' from source: set_fact 16142 1727204117.64696: Evaluated conditional (network_provider == "nm"): True 16142 1727204117.64803: variable '__network_wpa_supplicant_required' from source: role '' defaults 16142 1727204117.64900: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 16142 1727204117.65084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204117.67525: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204117.67597: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204117.67639: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204117.67681: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204117.67711: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204117.67806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204117.67840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204117.67869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204117.67913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204117.67937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204117.67982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204117.68005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 16142 1727204117.68038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204117.68076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204117.68096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204117.68137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204117.68160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204117.68186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204117.68229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204117.68242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204117.68413: variable 'network_connections' from source: task vars 16142 1727204117.68429: variable 'controller_profile' from source: play vars 16142 1727204117.68507: variable 'controller_profile' from source: play vars 16142 1727204117.68513: variable 'controller_device' from source: play vars 16142 1727204117.68582: variable 'controller_device' from source: play vars 16142 1727204117.68591: variable 'port1_profile' from source: play vars 16142 1727204117.68678: variable 'port1_profile' from source: play vars 16142 1727204117.68681: variable 'dhcp_interface1' from source: play vars 16142 1727204117.69069: variable 'dhcp_interface1' from source: play vars 16142 1727204117.69072: variable 'controller_profile' from source: play vars 16142 1727204117.69074: variable 'controller_profile' from source: play vars 16142 1727204117.69077: variable 'port2_profile' from source: play vars 16142 1727204117.69079: variable 'port2_profile' from source: play vars 16142 1727204117.69080: variable 'dhcp_interface2' from source: play vars 16142 1727204117.69083: variable 'dhcp_interface2' from source: play vars 16142 1727204117.69085: variable 'controller_profile' from source: play vars 16142 1727204117.69087: variable 'controller_profile' from source: play vars 16142 1727204117.69089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204117.69260: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204117.69304: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204117.69338: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204117.69368: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204117.69411: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204117.69436: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204117.69465: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204117.69489: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204117.69546: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204117.69791: variable 'network_connections' from source: task vars 16142 1727204117.69795: variable 'controller_profile' from source: play vars 16142 1727204117.69856: variable 'controller_profile' from source: play vars 16142 1727204117.69863: variable 'controller_device' from source: play vars 16142 1727204117.69923: variable 'controller_device' from source: play vars 16142 1727204117.69937: variable 'port1_profile' from source: play vars 16142 1727204117.69994: variable 'port1_profile' from source: play vars 16142 1727204117.70001: variable 'dhcp_interface1' from source: play vars 16142 1727204117.70060: variable 'dhcp_interface1' from source: play vars 16142 1727204117.70065: variable 'controller_profile' from source: play vars 16142 1727204117.70135: variable 'controller_profile' from source: play vars 16142 1727204117.70139: variable 'port2_profile' from source: play vars 16142 1727204117.70211: variable 'port2_profile' from source: play vars 16142 1727204117.70218: variable 'dhcp_interface2' from source: play vars 16142 1727204117.70283: variable 'dhcp_interface2' from source: play vars 16142 1727204117.70289: variable 'controller_profile' from source: play vars 16142 1727204117.70353: variable 'controller_profile' from source: play vars 16142 1727204117.70405: Evaluated conditional (__network_wpa_supplicant_required): False 16142 1727204117.70408: when evaluation is False, skipping this task 16142 1727204117.70411: _execute() done 16142 1727204117.70418: dumping result to json 16142 1727204117.70422: done dumping result, returning 16142 1727204117.70431: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-fddd-f6c7-000000000033] 16142 1727204117.70436: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000033 16142 1727204117.70554: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000033 16142 1727204117.70557: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 16142 1727204117.70625: no more pending results, returning what we have 16142 1727204117.70631: results queue empty 16142 1727204117.70632: checking for any_errors_fatal 16142 1727204117.70655: done checking for any_errors_fatal 16142 
1727204117.70656: checking for max_fail_percentage 16142 1727204117.70658: done checking for max_fail_percentage 16142 1727204117.70659: checking to see if all hosts have failed and the running result is not ok 16142 1727204117.70660: done checking to see if all hosts have failed 16142 1727204117.70661: getting the remaining hosts for this loop 16142 1727204117.70662: done getting the remaining hosts for this loop 16142 1727204117.70668: getting the next task for host managed-node2 16142 1727204117.70676: done getting next task for host managed-node2 16142 1727204117.70681: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 16142 1727204117.70683: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204117.70699: getting variables 16142 1727204117.70701: in VariableManager get_vars() 16142 1727204117.70762: Calling all_inventory to load vars for managed-node2 16142 1727204117.70767: Calling groups_inventory to load vars for managed-node2 16142 1727204117.70770: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204117.70782: Calling all_plugins_play to load vars for managed-node2 16142 1727204117.70785: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204117.70788: Calling groups_plugins_play to load vars for managed-node2 16142 1727204117.72470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204117.74172: done with get_vars() 16142 1727204117.74203: done getting variables 16142 1727204117.74274: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:55:17 -0400 (0:00:00.114) 0:00:16.919 ***** 16142 1727204117.74309: entering _queue_task() for managed-node2/service 16142 1727204117.74670: worker is 1 (out of 1 available) 16142 1727204117.74684: exiting _queue_task() for managed-node2/service 16142 1727204117.74697: done queuing things up, now waiting for results queue to drain 16142 1727204117.74698: waiting for pending results... 
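The wpa_supplicant task above was skipped purely on its when conditions: the log shows network_provider == "nm" evaluating True but __network_wpa_supplicant_required evaluating False. A sketch of that guard pattern (the service parameters are assumptions; only the conditions and task title come from this log):

    - name: Enable and start wpa_supplicant   # task title from the TASK header above
      ansible.builtin.service:
        name: wpa_supplicant
        state: started
        enabled: true
      when:
        - network_provider == "nm"                      # evaluated True in this run
        - __network_wpa_supplicant_required | bool      # evaluated False, so the task was skipped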
16142 1727204117.75004: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 16142 1727204117.75134: in run() - task 0affcd87-79f5-fddd-f6c7-000000000034 16142 1727204117.75148: variable 'ansible_search_path' from source: unknown 16142 1727204117.75152: variable 'ansible_search_path' from source: unknown 16142 1727204117.75191: calling self._execute() 16142 1727204117.75289: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204117.75295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204117.75310: variable 'omit' from source: magic vars 16142 1727204117.75684: variable 'ansible_distribution_major_version' from source: facts 16142 1727204117.75699: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204117.75818: variable 'network_provider' from source: set_fact 16142 1727204117.75824: Evaluated conditional (network_provider == "initscripts"): False 16142 1727204117.75827: when evaluation is False, skipping this task 16142 1727204117.75829: _execute() done 16142 1727204117.75834: dumping result to json 16142 1727204117.75837: done dumping result, returning 16142 1727204117.75843: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-fddd-f6c7-000000000034] 16142 1727204117.75854: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000034 16142 1727204117.75949: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000034 16142 1727204117.75953: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204117.76002: no more pending results, returning what we have 16142 1727204117.76007: results queue empty 16142 1727204117.76009: checking for any_errors_fatal 16142 1727204117.76022: done checking for any_errors_fatal 16142 1727204117.76023: checking for max_fail_percentage 16142 1727204117.76026: done checking for max_fail_percentage 16142 1727204117.76027: checking to see if all hosts have failed and the running result is not ok 16142 1727204117.76028: done checking to see if all hosts have failed 16142 1727204117.76028: getting the remaining hosts for this loop 16142 1727204117.76030: done getting the remaining hosts for this loop 16142 1727204117.76034: getting the next task for host managed-node2 16142 1727204117.76042: done getting next task for host managed-node2 16142 1727204117.76048: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 16142 1727204117.76051: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204117.76073: getting variables 16142 1727204117.76076: in VariableManager get_vars() 16142 1727204117.76139: Calling all_inventory to load vars for managed-node2 16142 1727204117.76142: Calling groups_inventory to load vars for managed-node2 16142 1727204117.76145: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204117.76159: Calling all_plugins_play to load vars for managed-node2 16142 1727204117.76161: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204117.76166: Calling groups_plugins_play to load vars for managed-node2 16142 1727204117.77985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204117.79699: done with get_vars() 16142 1727204117.79737: done getting variables 16142 1727204117.79802: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:55:17 -0400 (0:00:00.055) 0:00:16.975 ***** 16142 1727204117.79837: entering _queue_task() for managed-node2/copy 16142 1727204117.80189: worker is 1 (out of 1 available) 16142 1727204117.80203: exiting _queue_task() for managed-node2/copy 16142 1727204117.80214: done queuing things up, now waiting for results queue to drain 16142 1727204117.80215: waiting for pending results... 16142 1727204117.80584: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 16142 1727204117.80871: in run() - task 0affcd87-79f5-fddd-f6c7-000000000035 16142 1727204117.80877: variable 'ansible_search_path' from source: unknown 16142 1727204117.80880: variable 'ansible_search_path' from source: unknown 16142 1727204117.80883: calling self._execute() 16142 1727204117.80887: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204117.80890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204117.80893: variable 'omit' from source: magic vars 16142 1727204117.81239: variable 'ansible_distribution_major_version' from source: facts 16142 1727204117.81253: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204117.81366: variable 'network_provider' from source: set_fact 16142 1727204117.81374: Evaluated conditional (network_provider == "initscripts"): False 16142 1727204117.81377: when evaluation is False, skipping this task 16142 1727204117.81380: _execute() done 16142 1727204117.81385: dumping result to json 16142 1727204117.81388: done dumping result, returning 16142 1727204117.81397: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-fddd-f6c7-000000000035] 16142 1727204117.81404: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000035 16142 1727204117.81508: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000035 16142 1727204117.81511: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 16142 1727204117.81560: no more pending results, returning what we have 16142 1727204117.81566: results queue empty 16142 1727204117.81566: checking for any_errors_fatal 16142 1727204117.81572: done checking for any_errors_fatal 16142 1727204117.81572: checking for max_fail_percentage 16142 1727204117.81575: done checking for max_fail_percentage 16142 1727204117.81576: checking to see if all hosts have failed and the running result is not ok 16142 1727204117.81577: done checking to see if all hosts have failed 16142 1727204117.81578: getting the remaining hosts for this loop 16142 1727204117.81579: done getting the remaining hosts for this loop 16142 1727204117.81583: getting the next task for host managed-node2 16142 1727204117.81590: done getting next task for host managed-node2 16142 1727204117.81595: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 16142 1727204117.81598: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204117.81616: getting variables 16142 1727204117.81618: in VariableManager get_vars() 16142 1727204117.81675: Calling all_inventory to load vars for managed-node2 16142 1727204117.81678: Calling groups_inventory to load vars for managed-node2 16142 1727204117.81680: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204117.81691: Calling all_plugins_play to load vars for managed-node2 16142 1727204117.81693: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204117.81696: Calling groups_plugins_play to load vars for managed-node2 16142 1727204117.83361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204117.85185: done with get_vars() 16142 1727204117.85211: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:55:17 -0400 (0:00:00.054) 0:00:17.029 ***** 16142 1727204117.85308: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 16142 1727204117.85310: Creating lock for fedora.linux_system_roles.network_connections 16142 1727204117.85664: worker is 1 (out of 1 available) 16142 1727204117.85680: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 16142 1727204117.85693: done queuing things up, now waiting for results queue to drain 16142 1727204117.85694: waiting for pending results... 
16142 1727204117.85992: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 16142 1727204117.86117: in run() - task 0affcd87-79f5-fddd-f6c7-000000000036 16142 1727204117.86134: variable 'ansible_search_path' from source: unknown 16142 1727204117.86139: variable 'ansible_search_path' from source: unknown 16142 1727204117.86182: calling self._execute() 16142 1727204117.86280: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204117.86284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204117.86294: variable 'omit' from source: magic vars 16142 1727204117.86705: variable 'ansible_distribution_major_version' from source: facts 16142 1727204117.86719: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204117.86725: variable 'omit' from source: magic vars 16142 1727204117.86789: variable 'omit' from source: magic vars 16142 1727204117.86960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204117.89637: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204117.89706: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204117.89753: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204117.89811: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204117.89844: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204117.89938: variable 'network_provider' from source: set_fact 16142 1727204117.90079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204117.90123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204117.90158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204117.90201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204117.90216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204117.90298: variable 'omit' from source: magic vars 16142 1727204117.90523: variable 'omit' from source: magic vars 16142 1727204117.90741: variable 'network_connections' from source: task vars 16142 1727204117.90754: variable 'controller_profile' from source: play vars 16142 1727204117.90938: variable 'controller_profile' from source: play vars 16142 1727204117.90945: variable 'controller_device' from source: play vars 16142 1727204117.91123: variable 'controller_device' from source: play vars 16142 1727204117.91136: variable 'port1_profile' 
from source: play vars 16142 1727204117.91191: variable 'port1_profile' from source: play vars 16142 1727204117.91197: variable 'dhcp_interface1' from source: play vars 16142 1727204117.91373: variable 'dhcp_interface1' from source: play vars 16142 1727204117.91379: variable 'controller_profile' from source: play vars 16142 1727204117.91556: variable 'controller_profile' from source: play vars 16142 1727204117.91564: variable 'port2_profile' from source: play vars 16142 1727204117.91625: variable 'port2_profile' from source: play vars 16142 1727204117.91634: variable 'dhcp_interface2' from source: play vars 16142 1727204117.91856: variable 'dhcp_interface2' from source: play vars 16142 1727204117.91862: variable 'controller_profile' from source: play vars 16142 1727204117.91931: variable 'controller_profile' from source: play vars 16142 1727204117.92188: variable 'omit' from source: magic vars 16142 1727204117.92216: variable '__lsr_ansible_managed' from source: task vars 16142 1727204117.92273: variable '__lsr_ansible_managed' from source: task vars 16142 1727204117.92470: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 16142 1727204117.92709: Loaded config def from plugin (lookup/template) 16142 1727204117.92713: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 16142 1727204117.92746: File lookup term: get_ansible_managed.j2 16142 1727204117.92750: variable 'ansible_search_path' from source: unknown 16142 1727204117.92761: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 16142 1727204117.92774: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 16142 1727204117.92790: variable 'ansible_search_path' from source: unknown 16142 1727204118.03221: variable 'ansible_managed' from source: unknown 16142 1727204118.03599: variable 'omit' from source: magic vars 16142 1727204118.03630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204118.03655: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204118.03677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204118.03814: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204118.03823: Loading ShellModule 'sh' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204118.03854: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204118.03858: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204118.03860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204118.04087: Set connection var ansible_timeout to 10 16142 1727204118.04091: Set connection var ansible_connection to ssh 16142 1727204118.04093: Set connection var ansible_shell_type to sh 16142 1727204118.04100: Set connection var ansible_shell_executable to /bin/sh 16142 1727204118.04105: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204118.04118: Set connection var ansible_pipelining to False 16142 1727204118.04254: variable 'ansible_shell_executable' from source: unknown 16142 1727204118.04258: variable 'ansible_connection' from source: unknown 16142 1727204118.04261: variable 'ansible_module_compression' from source: unknown 16142 1727204118.04266: variable 'ansible_shell_type' from source: unknown 16142 1727204118.04269: variable 'ansible_shell_executable' from source: unknown 16142 1727204118.04271: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204118.04273: variable 'ansible_pipelining' from source: unknown 16142 1727204118.04275: variable 'ansible_timeout' from source: unknown 16142 1727204118.04279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204118.04530: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204118.04540: variable 'omit' from source: magic vars 16142 1727204118.04548: starting attempt loop 16142 1727204118.04673: running the handler 16142 1727204118.04687: _low_level_execute_command(): starting 16142 1727204118.04696: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204118.06685: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204118.06855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204118.06859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204118.06902: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204118.06906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 16142 1727204118.06920: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204118.06925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204118.06930: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204118.06956: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204118.07022: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204118.07173: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204118.07183: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204118.07281: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204118.08914: stdout chunk (state=3): >>>/root <<< 16142 1727204118.09118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204118.09122: stdout chunk (state=3): >>><<< 16142 1727204118.09125: stderr chunk (state=3): >>><<< 16142 1727204118.09254: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204118.09258: _low_level_execute_command(): starting 16142 1727204118.09261: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204118.0915074-17620-176945063165070 `" && echo ansible-tmp-1727204118.0915074-17620-176945063165070="` echo /root/.ansible/tmp/ansible-tmp-1727204118.0915074-17620-176945063165070 `" ) && sleep 0' 16142 1727204118.10001: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204118.10022: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204118.10051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204118.10077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204118.10118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204118.10135: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204118.10152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204118.10174: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204118.10187: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204118.10198: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204118.10211: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 16142 1727204118.10224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204118.10246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204118.10259: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204118.10277: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204118.10292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204118.10377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204118.10401: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204118.10419: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204118.10499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204118.12368: stdout chunk (state=3): >>>ansible-tmp-1727204118.0915074-17620-176945063165070=/root/.ansible/tmp/ansible-tmp-1727204118.0915074-17620-176945063165070 <<< 16142 1727204118.12591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204118.12596: stdout chunk (state=3): >>><<< 16142 1727204118.12598: stderr chunk (state=3): >>><<< 16142 1727204118.13272: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204118.0915074-17620-176945063165070=/root/.ansible/tmp/ansible-tmp-1727204118.0915074-17620-176945063165070 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204118.13281: variable 'ansible_module_compression' from source: unknown 16142 1727204118.13284: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 16142 1727204118.13324: ANSIBALLZ: Acquiring lock 16142 1727204118.13327: ANSIBALLZ: Lock acquired: 140089290324016 16142 1727204118.13329: ANSIBALLZ: Creating module 16142 1727204118.39587: ANSIBALLZ: Writing module into payload 16142 1727204118.40048: ANSIBALLZ: Writing module 16142 1727204118.40083: ANSIBALLZ: Renaming module 16142 1727204118.40093: ANSIBALLZ: Done creating module 16142 1727204118.40122: variable 'ansible_facts' from source: unknown 16142 1727204118.40247: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204118.0915074-17620-176945063165070/AnsiballZ_network_connections.py 16142 
1727204118.40406: Sending initial data 16142 1727204118.40409: Sent initial data (168 bytes) 16142 1727204118.41411: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204118.41425: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204118.41446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204118.41468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204118.41511: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204118.41523: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204118.41541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204118.41563: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204118.41578: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204118.41589: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204118.41603: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204118.41618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204118.41635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204118.41648: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204118.41664: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204118.41680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204118.41753: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204118.41782: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204118.41799: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204118.41886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204118.43704: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204118.43738: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204118.43790: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpsapeav1p /root/.ansible/tmp/ansible-tmp-1727204118.0915074-17620-176945063165070/AnsiballZ_network_connections.py <<< 16142 1727204118.43825: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204118.45458: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204118.45669: 
stderr chunk (state=3): >>><<< 16142 1727204118.45679: stdout chunk (state=3): >>><<< 16142 1727204118.45682: done transferring module to remote 16142 1727204118.45684: _low_level_execute_command(): starting 16142 1727204118.45686: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204118.0915074-17620-176945063165070/ /root/.ansible/tmp/ansible-tmp-1727204118.0915074-17620-176945063165070/AnsiballZ_network_connections.py && sleep 0' 16142 1727204118.46299: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204118.46313: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204118.46332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204118.46352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204118.46399: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204118.46413: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204118.46429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204118.46450: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204118.46466: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204118.46479: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204118.46493: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204118.46507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204118.46523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204118.46537: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204118.46553: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204118.46569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204118.46645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204118.46675: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204118.46693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204118.46768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204118.48771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204118.48775: stderr chunk (state=3): >>><<< 16142 1727204118.48778: stdout chunk (state=3): >>><<< 16142 1727204118.48781: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204118.48788: _low_level_execute_command(): starting 16142 1727204118.48790: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204118.0915074-17620-176945063165070/AnsiballZ_network_connections.py && sleep 0' 16142 1727204118.49458: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204118.49468: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204118.49480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204118.49494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204118.49535: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204118.49546: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204118.49557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204118.49573: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204118.49685: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204118.49692: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204118.49701: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204118.49710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204118.49726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204118.49733: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204118.49743: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204118.49752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204118.49825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204118.49842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204118.49853: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204118.50083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204118.87726: stdout chunk (state=3): >>> <<< 16142 1727204118.87768: stdout chunk (state=3): >>>{"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 73afa86b-f147-47bf-9096-10366249563c\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 72f186f1-b611-4f2e-9d00-de0c3bf7aa23\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 9eebdc14-bdb2-41b3-94db-1a5b2e988b68\n[010] #0, state:up 
persistent_state:present, 'bond0': up connection bond0, 73afa86b-f147-47bf-9096-10366249563c (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 72f186f1-b611-4f2e-9d00-de0c3bf7aa23 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 9eebdc14-bdb2-41b3-94db-1a5b2e988b68 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 16142 1727204118.89856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 16142 1727204118.89863: stdout chunk (state=3): >>><<< 16142 1727204118.89868: stderr chunk (state=3): >>><<< 16142 1727204118.89896: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 73afa86b-f147-47bf-9096-10366249563c\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 72f186f1-b611-4f2e-9d00-de0c3bf7aa23\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 9eebdc14-bdb2-41b3-94db-1a5b2e988b68\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 73afa86b-f147-47bf-9096-10366249563c (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 72f186f1-b611-4f2e-9d00-de0c3bf7aa23 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 9eebdc14-bdb2-41b3-94db-1a5b2e988b68 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, 
{"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 16142 1727204118.89955: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204118.0915074-17620-176945063165070/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204118.89967: _low_level_execute_command(): starting 16142 1727204118.89970: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204118.0915074-17620-176945063165070/ > /dev/null 2>&1 && sleep 0' 16142 1727204118.92603: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204118.92607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204118.92883: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204118.92888: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204118.92904: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204118.92909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204118.92926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204118.93008: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204118.93280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204118.93296: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204118.93362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204118.95276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204118.95339: stderr chunk (state=3): >>><<< 16142 1727204118.95342: stdout chunk (state=3): >>><<< 16142 1727204118.95358: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204118.95367: handler run complete 16142 1727204118.95410: attempt loop complete, returning result 16142 1727204118.95413: _execute() done 16142 1727204118.95416: dumping result to json 16142 1727204118.95422: done dumping result, returning 16142 1727204118.95436: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-fddd-f6c7-000000000036] 16142 1727204118.95438: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000036 16142 1727204118.95573: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000036 16142 1727204118.95576: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { 
"controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 73afa86b-f147-47bf-9096-10366249563c [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 72f186f1-b611-4f2e-9d00-de0c3bf7aa23 [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 9eebdc14-bdb2-41b3-94db-1a5b2e988b68 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 73afa86b-f147-47bf-9096-10366249563c (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 72f186f1-b611-4f2e-9d00-de0c3bf7aa23 (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 9eebdc14-bdb2-41b3-94db-1a5b2e988b68 (not-active) 16142 1727204118.95729: no more pending results, returning what we have 16142 1727204118.95735: results queue empty 16142 1727204118.95736: checking for any_errors_fatal 16142 1727204118.95742: done checking for any_errors_fatal 16142 1727204118.95743: checking for max_fail_percentage 16142 1727204118.95744: done checking for max_fail_percentage 16142 1727204118.95745: checking to see if all hosts have failed and the running result is not ok 16142 1727204118.95746: done checking to see if all hosts have failed 16142 1727204118.95746: getting the remaining hosts for this loop 16142 1727204118.95748: done getting the remaining hosts for this loop 16142 1727204118.95751: getting the next task for host managed-node2 16142 1727204118.95756: done getting next task for host managed-node2 16142 1727204118.95760: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 16142 1727204118.95762: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204118.95774: getting variables 16142 1727204118.95776: in VariableManager get_vars() 16142 1727204118.95822: Calling all_inventory to load vars for managed-node2 16142 1727204118.95825: Calling groups_inventory to load vars for managed-node2 16142 1727204118.95826: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204118.95837: Calling all_plugins_play to load vars for managed-node2 16142 1727204118.95840: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204118.95848: Calling groups_plugins_play to load vars for managed-node2 16142 1727204118.98672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204119.03214: done with get_vars() 16142 1727204119.03253: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:55:19 -0400 (0:00:01.180) 0:00:18.210 ***** 16142 1727204119.03339: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 16142 1727204119.03341: Creating lock for fedora.linux_system_roles.network_state 16142 1727204119.03692: worker is 1 (out of 1 available) 16142 1727204119.03708: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 16142 1727204119.03721: done queuing things up, now waiting for results queue to drain 16142 1727204119.03722: waiting for pending results... 16142 1727204119.04604: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 16142 1727204119.04980: in run() - task 0affcd87-79f5-fddd-f6c7-000000000037 16142 1727204119.05002: variable 'ansible_search_path' from source: unknown 16142 1727204119.05066: variable 'ansible_search_path' from source: unknown 16142 1727204119.05109: calling self._execute() 16142 1727204119.05255: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204119.05391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204119.05406: variable 'omit' from source: magic vars 16142 1727204119.06121: variable 'ansible_distribution_major_version' from source: facts 16142 1727204119.06269: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204119.06510: variable 'network_state' from source: role '' defaults 16142 1727204119.06524: Evaluated conditional (network_state != {}): False 16142 1727204119.06531: when evaluation is False, skipping this task 16142 1727204119.06540: _execute() done 16142 1727204119.06546: dumping result to json 16142 1727204119.06554: done dumping result, returning 16142 1727204119.06566: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-fddd-f6c7-000000000037] 16142 1727204119.06578: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000037 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204119.06850: no more pending results, returning what we have 16142 1727204119.06854: results queue empty 16142 1727204119.06855: checking for any_errors_fatal 16142 1727204119.06870: done checking for any_errors_fatal 16142 1727204119.06871: checking for max_fail_percentage 16142 1727204119.06873: done checking for max_fail_percentage 16142 
1727204119.06874: checking to see if all hosts have failed and the running result is not ok 16142 1727204119.06874: done checking to see if all hosts have failed 16142 1727204119.06875: getting the remaining hosts for this loop 16142 1727204119.06876: done getting the remaining hosts for this loop 16142 1727204119.06880: getting the next task for host managed-node2 16142 1727204119.06886: done getting next task for host managed-node2 16142 1727204119.06890: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 16142 1727204119.06893: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204119.06910: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000037 16142 1727204119.06915: WORKER PROCESS EXITING 16142 1727204119.06923: getting variables 16142 1727204119.06925: in VariableManager get_vars() 16142 1727204119.06988: Calling all_inventory to load vars for managed-node2 16142 1727204119.06991: Calling groups_inventory to load vars for managed-node2 16142 1727204119.06993: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204119.07005: Calling all_plugins_play to load vars for managed-node2 16142 1727204119.07008: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204119.07011: Calling groups_plugins_play to load vars for managed-node2 16142 1727204119.09847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204119.14685: done with get_vars() 16142 1727204119.14724: done getting variables 16142 1727204119.14790: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:55:19 -0400 (0:00:00.114) 0:00:18.325 ***** 16142 1727204119.14829: entering _queue_task() for managed-node2/debug 16142 1727204119.15161: worker is 1 (out of 1 available) 16142 1727204119.15176: exiting _queue_task() for managed-node2/debug 16142 1727204119.15187: done queuing things up, now waiting for results queue to drain 16142 1727204119.15189: waiting for pending results... 
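For reference, the connection profiles applied by the "Configure networking connection profiles" task above map directly onto the role's network_connections variable. The sketch below is a minimal reconstruction of playbook variables that would produce the module_args recorded in this log; the connection values are copied from that result, while the variable name network_connections is the role's documented entry point and is assumed here rather than taken from this output.

# Sketch only: values copied from the module result logged above.
network_connections:
  - name: bond0
    state: up
    type: bond
    interface_name: nm-bond
    bond:
      mode: active-backup
      miimon: 110
    ip:
      route_metric4: 65535
  - name: bond0.0
    state: up
    type: ethernet
    interface_name: test1
    controller: bond0
  - name: bond0.1
    state: up
    type: ethernet
    interface_name: test2
    controller: bond0

With this input the role adds the three profiles and activates them, which matches the "add connection" / "up connection" lines in the stderr captured above.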
16142 1727204119.17139: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 16142 1727204119.17372: in run() - task 0affcd87-79f5-fddd-f6c7-000000000038 16142 1727204119.17398: variable 'ansible_search_path' from source: unknown 16142 1727204119.17407: variable 'ansible_search_path' from source: unknown 16142 1727204119.17455: calling self._execute() 16142 1727204119.17572: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204119.17586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204119.17605: variable 'omit' from source: magic vars 16142 1727204119.18142: variable 'ansible_distribution_major_version' from source: facts 16142 1727204119.18267: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204119.18281: variable 'omit' from source: magic vars 16142 1727204119.18357: variable 'omit' from source: magic vars 16142 1727204119.18500: variable 'omit' from source: magic vars 16142 1727204119.18549: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204119.18709: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204119.18744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204119.18768: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204119.18786: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204119.18827: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204119.18876: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204119.18885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204119.19009: Set connection var ansible_timeout to 10 16142 1727204119.19134: Set connection var ansible_connection to ssh 16142 1727204119.19179: Set connection var ansible_shell_type to sh 16142 1727204119.19191: Set connection var ansible_shell_executable to /bin/sh 16142 1727204119.19201: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204119.19214: Set connection var ansible_pipelining to False 16142 1727204119.19265: variable 'ansible_shell_executable' from source: unknown 16142 1727204119.19348: variable 'ansible_connection' from source: unknown 16142 1727204119.19357: variable 'ansible_module_compression' from source: unknown 16142 1727204119.19367: variable 'ansible_shell_type' from source: unknown 16142 1727204119.19377: variable 'ansible_shell_executable' from source: unknown 16142 1727204119.19384: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204119.19391: variable 'ansible_pipelining' from source: unknown 16142 1727204119.19397: variable 'ansible_timeout' from source: unknown 16142 1727204119.19404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204119.19773: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 
1727204119.19794: variable 'omit' from source: magic vars 16142 1727204119.19805: starting attempt loop 16142 1727204119.19814: running the handler 16142 1727204119.20058: variable '__network_connections_result' from source: set_fact 16142 1727204119.20175: handler run complete 16142 1727204119.20347: attempt loop complete, returning result 16142 1727204119.20354: _execute() done 16142 1727204119.20359: dumping result to json 16142 1727204119.20368: done dumping result, returning 16142 1727204119.20379: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-fddd-f6c7-000000000038] 16142 1727204119.20388: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000038 ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 73afa86b-f147-47bf-9096-10366249563c", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 72f186f1-b611-4f2e-9d00-de0c3bf7aa23", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 9eebdc14-bdb2-41b3-94db-1a5b2e988b68", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 73afa86b-f147-47bf-9096-10366249563c (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 72f186f1-b611-4f2e-9d00-de0c3bf7aa23 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 9eebdc14-bdb2-41b3-94db-1a5b2e988b68 (not-active)" ] } 16142 1727204119.20558: no more pending results, returning what we have 16142 1727204119.20563: results queue empty 16142 1727204119.20565: checking for any_errors_fatal 16142 1727204119.20573: done checking for any_errors_fatal 16142 1727204119.20574: checking for max_fail_percentage 16142 1727204119.20576: done checking for max_fail_percentage 16142 1727204119.20577: checking to see if all hosts have failed and the running result is not ok 16142 1727204119.20578: done checking to see if all hosts have failed 16142 1727204119.20578: getting the remaining hosts for this loop 16142 1727204119.20580: done getting the remaining hosts for this loop 16142 1727204119.20585: getting the next task for host managed-node2 16142 1727204119.20592: done getting next task for host managed-node2 16142 1727204119.20596: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 16142 1727204119.20599: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204119.20614: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000038 16142 1727204119.20618: WORKER PROCESS EXITING 16142 1727204119.20624: getting variables 16142 1727204119.20626: in VariableManager get_vars() 16142 1727204119.20686: Calling all_inventory to load vars for managed-node2 16142 1727204119.20689: Calling groups_inventory to load vars for managed-node2 16142 1727204119.20692: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204119.20703: Calling all_plugins_play to load vars for managed-node2 16142 1727204119.20706: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204119.20709: Calling groups_plugins_play to load vars for managed-node2 16142 1727204119.23801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204119.27704: done with get_vars() 16142 1727204119.27740: done getting variables 16142 1727204119.27806: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:55:19 -0400 (0:00:00.130) 0:00:18.455 ***** 16142 1727204119.27846: entering _queue_task() for managed-node2/debug 16142 1727204119.28168: worker is 1 (out of 1 available) 16142 1727204119.28183: exiting _queue_task() for managed-node2/debug 16142 1727204119.28195: done queuing things up, now waiting for results queue to drain 16142 1727204119.28196: waiting for pending results... 
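The "Show stderr messages for the network_connections" result printed above is produced by a plain debug task over the registered module result. A rough equivalent is sketched below; the displayed variable name is taken verbatim from the output, while the exact task in roles/network/tasks/main.yml may carry additional guards not visible in this log.

# Sketch of a debug task equivalent to the step logged above.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines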
16142 1727204119.29889: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 16142 1727204119.30401: in run() - task 0affcd87-79f5-fddd-f6c7-000000000039 16142 1727204119.30592: variable 'ansible_search_path' from source: unknown 16142 1727204119.30602: variable 'ansible_search_path' from source: unknown 16142 1727204119.30646: calling self._execute() 16142 1727204119.30789: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204119.30802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204119.30820: variable 'omit' from source: magic vars 16142 1727204119.31295: variable 'ansible_distribution_major_version' from source: facts 16142 1727204119.31385: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204119.31418: variable 'omit' from source: magic vars 16142 1727204119.31486: variable 'omit' from source: magic vars 16142 1727204119.31742: variable 'omit' from source: magic vars 16142 1727204119.31790: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204119.31830: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204119.31887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204119.31971: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204119.31987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204119.32095: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204119.32103: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204119.32110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204119.32391: Set connection var ansible_timeout to 10 16142 1727204119.32399: Set connection var ansible_connection to ssh 16142 1727204119.32409: Set connection var ansible_shell_type to sh 16142 1727204119.32418: Set connection var ansible_shell_executable to /bin/sh 16142 1727204119.32427: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204119.32440: Set connection var ansible_pipelining to False 16142 1727204119.32469: variable 'ansible_shell_executable' from source: unknown 16142 1727204119.32502: variable 'ansible_connection' from source: unknown 16142 1727204119.32509: variable 'ansible_module_compression' from source: unknown 16142 1727204119.32515: variable 'ansible_shell_type' from source: unknown 16142 1727204119.32521: variable 'ansible_shell_executable' from source: unknown 16142 1727204119.32527: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204119.32612: variable 'ansible_pipelining' from source: unknown 16142 1727204119.32619: variable 'ansible_timeout' from source: unknown 16142 1727204119.32626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204119.32889: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 
1727204119.32905: variable 'omit' from source: magic vars 16142 1727204119.32914: starting attempt loop 16142 1727204119.32935: running the handler 16142 1727204119.32989: variable '__network_connections_result' from source: set_fact 16142 1727204119.33253: variable '__network_connections_result' from source: set_fact 16142 1727204119.33554: handler run complete 16142 1727204119.33734: attempt loop complete, returning result 16142 1727204119.33743: _execute() done 16142 1727204119.33750: dumping result to json 16142 1727204119.33759: done dumping result, returning 16142 1727204119.33775: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-fddd-f6c7-000000000039] 16142 1727204119.33786: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000039 ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 73afa86b-f147-47bf-9096-10366249563c\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 72f186f1-b611-4f2e-9d00-de0c3bf7aa23\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 9eebdc14-bdb2-41b3-94db-1a5b2e988b68\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 73afa86b-f147-47bf-9096-10366249563c (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 72f186f1-b611-4f2e-9d00-de0c3bf7aa23 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 9eebdc14-bdb2-41b3-94db-1a5b2e988b68 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 73afa86b-f147-47bf-9096-10366249563c", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 72f186f1-b611-4f2e-9d00-de0c3bf7aa23", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 9eebdc14-bdb2-41b3-94db-1a5b2e988b68", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 73afa86b-f147-47bf-9096-10366249563c (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 72f186f1-b611-4f2e-9d00-de0c3bf7aa23 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 9eebdc14-bdb2-41b3-94db-1a5b2e988b68 (not-active)" ] } } 16142 1727204119.34030: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000039 16142 1727204119.34038: no more pending results, returning what we have 16142 1727204119.34042: results queue empty 16142 1727204119.34043: checking for any_errors_fatal 16142 1727204119.34052: WORKER PROCESS EXITING 16142 1727204119.34061: done checking for any_errors_fatal 16142 1727204119.34066: checking for max_fail_percentage 16142 1727204119.34069: done checking for 
max_fail_percentage 16142 1727204119.34070: checking to see if all hosts have failed and the running result is not ok 16142 1727204119.34071: done checking to see if all hosts have failed 16142 1727204119.34071: getting the remaining hosts for this loop 16142 1727204119.34073: done getting the remaining hosts for this loop 16142 1727204119.34078: getting the next task for host managed-node2 16142 1727204119.34085: done getting next task for host managed-node2 16142 1727204119.34090: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 16142 1727204119.34093: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204119.34106: getting variables 16142 1727204119.34108: in VariableManager get_vars() 16142 1727204119.34166: Calling all_inventory to load vars for managed-node2 16142 1727204119.34169: Calling groups_inventory to load vars for managed-node2 16142 1727204119.34171: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204119.34180: Calling all_plugins_play to load vars for managed-node2 16142 1727204119.34182: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204119.34185: Calling groups_plugins_play to load vars for managed-node2 16142 1727204119.35921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204119.37700: done with get_vars() 16142 1727204119.37736: done getting variables 16142 1727204119.37802: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:55:19 -0400 (0:00:00.099) 0:00:18.555 ***** 16142 1727204119.37843: entering _queue_task() for managed-node2/debug 16142 1727204119.38192: worker is 1 (out of 1 available) 16142 1727204119.38207: exiting _queue_task() for managed-node2/debug 16142 1727204119.38220: done queuing things up, now waiting for results queue to drain 16142 1727204119.38222: waiting for pending results... 
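Both "Configure networking state" (skipped earlier in this run) and the "Show debug messages for the network_state" task queued here are gated on the same condition, network_state != {}, which evaluates to False because network_state comes from the role defaults as an empty dict. A hedged sketch of that gating pattern follows; the condition is copied from the false_condition entries in this log, while the debugged variable name is assumed for illustration only.

- name: Show debug messages for the network_state
  ansible.builtin.debug:
    var: network_state          # variable name assumed for illustration
  when: network_state != {}     # condition copied from the log's false_condition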
16142 1727204119.38529: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 16142 1727204119.38658: in run() - task 0affcd87-79f5-fddd-f6c7-00000000003a 16142 1727204119.38680: variable 'ansible_search_path' from source: unknown 16142 1727204119.38683: variable 'ansible_search_path' from source: unknown 16142 1727204119.38725: calling self._execute() 16142 1727204119.38821: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204119.38827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204119.38837: variable 'omit' from source: magic vars 16142 1727204119.39245: variable 'ansible_distribution_major_version' from source: facts 16142 1727204119.39258: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204119.39388: variable 'network_state' from source: role '' defaults 16142 1727204119.39399: Evaluated conditional (network_state != {}): False 16142 1727204119.39403: when evaluation is False, skipping this task 16142 1727204119.39406: _execute() done 16142 1727204119.39408: dumping result to json 16142 1727204119.39410: done dumping result, returning 16142 1727204119.39419: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-fddd-f6c7-00000000003a] 16142 1727204119.39435: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000003a skipping: [managed-node2] => { "false_condition": "network_state != {}" } 16142 1727204119.39589: no more pending results, returning what we have 16142 1727204119.39594: results queue empty 16142 1727204119.39595: checking for any_errors_fatal 16142 1727204119.39605: done checking for any_errors_fatal 16142 1727204119.39606: checking for max_fail_percentage 16142 1727204119.39608: done checking for max_fail_percentage 16142 1727204119.39609: checking to see if all hosts have failed and the running result is not ok 16142 1727204119.39610: done checking to see if all hosts have failed 16142 1727204119.39611: getting the remaining hosts for this loop 16142 1727204119.39613: done getting the remaining hosts for this loop 16142 1727204119.39617: getting the next task for host managed-node2 16142 1727204119.39625: done getting next task for host managed-node2 16142 1727204119.39629: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 16142 1727204119.39632: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204119.39651: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000003a 16142 1727204119.39655: WORKER PROCESS EXITING 16142 1727204119.39662: getting variables 16142 1727204119.39666: in VariableManager get_vars() 16142 1727204119.39725: Calling all_inventory to load vars for managed-node2 16142 1727204119.39728: Calling groups_inventory to load vars for managed-node2 16142 1727204119.39730: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204119.39742: Calling all_plugins_play to load vars for managed-node2 16142 1727204119.39745: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204119.39748: Calling groups_plugins_play to load vars for managed-node2 16142 1727204119.41367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204119.45216: done with get_vars() 16142 1727204119.45243: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:55:19 -0400 (0:00:00.075) 0:00:18.630 ***** 16142 1727204119.45349: entering _queue_task() for managed-node2/ping 16142 1727204119.45351: Creating lock for ping 16142 1727204119.46397: worker is 1 (out of 1 available) 16142 1727204119.46414: exiting _queue_task() for managed-node2/ping 16142 1727204119.46426: done queuing things up, now waiting for results queue to drain 16142 1727204119.46427: waiting for pending results... 16142 1727204119.47373: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 16142 1727204119.47662: in run() - task 0affcd87-79f5-fddd-f6c7-00000000003b 16142 1727204119.47769: variable 'ansible_search_path' from source: unknown 16142 1727204119.47778: variable 'ansible_search_path' from source: unknown 16142 1727204119.47818: calling self._execute() 16142 1727204119.47950: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204119.48052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204119.48083: variable 'omit' from source: magic vars 16142 1727204119.49916: variable 'ansible_distribution_major_version' from source: facts 16142 1727204119.49940: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204119.49954: variable 'omit' from source: magic vars 16142 1727204119.50029: variable 'omit' from source: magic vars 16142 1727204119.50079: variable 'omit' from source: magic vars 16142 1727204119.50128: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204119.50179: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204119.50207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204119.50233: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204119.50251: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204119.50295: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204119.50378: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204119.50387: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204119.50599: Set connection var ansible_timeout to 10 16142 1727204119.50609: Set connection var ansible_connection to ssh 16142 1727204119.50708: Set connection var ansible_shell_type to sh 16142 1727204119.50720: Set connection var ansible_shell_executable to /bin/sh 16142 1727204119.50730: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204119.50745: Set connection var ansible_pipelining to False 16142 1727204119.50775: variable 'ansible_shell_executable' from source: unknown 16142 1727204119.50782: variable 'ansible_connection' from source: unknown 16142 1727204119.50789: variable 'ansible_module_compression' from source: unknown 16142 1727204119.50796: variable 'ansible_shell_type' from source: unknown 16142 1727204119.50807: variable 'ansible_shell_executable' from source: unknown 16142 1727204119.50815: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204119.50822: variable 'ansible_pipelining' from source: unknown 16142 1727204119.50876: variable 'ansible_timeout' from source: unknown 16142 1727204119.50886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204119.51393: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204119.51410: variable 'omit' from source: magic vars 16142 1727204119.51421: starting attempt loop 16142 1727204119.51428: running the handler 16142 1727204119.51450: _low_level_execute_command(): starting 16142 1727204119.51504: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204119.53702: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204119.53707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204119.53730: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204119.53738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204119.53908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204119.53977: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204119.53981: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204119.54039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204119.55737: stdout chunk (state=3): >>>/root <<< 16142 1727204119.55834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204119.55924: 
stderr chunk (state=3): >>><<< 16142 1727204119.55927: stdout chunk (state=3): >>><<< 16142 1727204119.55972: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204119.55976: _low_level_execute_command(): starting 16142 1727204119.56068: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204119.5595334-17679-226115108300092 `" && echo ansible-tmp-1727204119.5595334-17679-226115108300092="` echo /root/.ansible/tmp/ansible-tmp-1727204119.5595334-17679-226115108300092 `" ) && sleep 0' 16142 1727204119.57871: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204119.57876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204119.57892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204119.58055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204119.58098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204119.58101: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204119.58156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204119.60062: stdout chunk (state=3): >>>ansible-tmp-1727204119.5595334-17679-226115108300092=/root/.ansible/tmp/ansible-tmp-1727204119.5595334-17679-226115108300092 <<< 16142 1727204119.60175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204119.60255: stderr chunk (state=3): >>><<< 
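The log entries around this point belong to the "Re-test connectivity" task: a remote temporary directory is created, AnsiballZ_ping.py is copied over SFTP, and the module is executed to confirm the managed host is still reachable and can run Python modules after the network change. A rough sketch of such a task is shown below; the fully qualified module name is assumed, since the log only records the ping action being queued and transferred.

# Sketch only: a connectivity re-test of the kind executed here.
- name: Re-test connectivity
  ansible.builtin.ping: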
16142 1727204119.60258: stdout chunk (state=3): >>><<< 16142 1727204119.60473: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204119.5595334-17679-226115108300092=/root/.ansible/tmp/ansible-tmp-1727204119.5595334-17679-226115108300092 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204119.60476: variable 'ansible_module_compression' from source: unknown 16142 1727204119.60479: ANSIBALLZ: Using lock for ping 16142 1727204119.60481: ANSIBALLZ: Acquiring lock 16142 1727204119.60483: ANSIBALLZ: Lock acquired: 140089294878768 16142 1727204119.60485: ANSIBALLZ: Creating module 16142 1727204119.86477: ANSIBALLZ: Writing module into payload 16142 1727204119.86679: ANSIBALLZ: Writing module 16142 1727204119.86702: ANSIBALLZ: Renaming module 16142 1727204119.86706: ANSIBALLZ: Done creating module 16142 1727204119.86722: variable 'ansible_facts' from source: unknown 16142 1727204119.86909: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204119.5595334-17679-226115108300092/AnsiballZ_ping.py 16142 1727204119.87639: Sending initial data 16142 1727204119.87642: Sent initial data (153 bytes) 16142 1727204119.89415: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204119.89419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204119.89597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204119.89601: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204119.89621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204119.89626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204119.89816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master <<< 16142 1727204119.89822: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204119.89840: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204119.90001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204119.91818: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204119.91839: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204119.91878: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmptrbycs00 /root/.ansible/tmp/ansible-tmp-1727204119.5595334-17679-226115108300092/AnsiballZ_ping.py <<< 16142 1727204119.91913: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204119.93187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204119.93263: stderr chunk (state=3): >>><<< 16142 1727204119.93268: stdout chunk (state=3): >>><<< 16142 1727204119.93290: done transferring module to remote 16142 1727204119.93300: _low_level_execute_command(): starting 16142 1727204119.93305: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204119.5595334-17679-226115108300092/ /root/.ansible/tmp/ansible-tmp-1727204119.5595334-17679-226115108300092/AnsiballZ_ping.py && sleep 0' 16142 1727204119.94940: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204119.94946: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204119.94957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204119.94973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204119.95023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204119.95029: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204119.95039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204119.95052: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204119.95059: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204119.95128: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204119.95137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204119.95147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204119.95159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204119.95169: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204119.95176: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204119.95186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204119.95267: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204119.95353: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204119.95360: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204119.95563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204119.97334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204119.97339: stdout chunk (state=3): >>><<< 16142 1727204119.97341: stderr chunk (state=3): >>><<< 16142 1727204119.97369: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204119.97373: _low_level_execute_command(): starting 16142 1727204119.97375: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204119.5595334-17679-226115108300092/AnsiballZ_ping.py && sleep 0' 16142 1727204119.98545: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204119.98569: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204119.99188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204119.99211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204119.99267: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204119.99281: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204119.99296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204119.99315: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204119.99330: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204119.99346: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204119.99359: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 16142 1727204119.99376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204119.99393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204119.99406: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204119.99418: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204119.99436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204119.99517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204119.99590: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204119.99785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204119.99874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204120.12863: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 16142 1727204120.13903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 16142 1727204120.13907: stdout chunk (state=3): >>><<< 16142 1727204120.13909: stderr chunk (state=3): >>><<< 16142 1727204120.14051: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
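The exchange above is one complete module round-trip: the controller creates a remote temp directory with "umask 77 && mkdir -p", transfers AnsiballZ_ping.py over sftp, marks it executable, runs it with /usr/bin/python3.9, and reads back {"ping": "pong"}; the temp directory is removed in the cleanup command that follows. A minimal task that would drive this round-trip, reconstructed only from the module name and arguments visible in the log (the role's actual "Re-test connectivity" source may differ), is:

    # Hedged reconstruction of the "Re-test connectivity" step seen in this log.
    # The log records _execute_module(ping, ...) with only default arguments,
    # so a bare ping call is assumed here.
    - name: Re-test connectivity
      ansible.builtin.ping: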
16142 1727204120.14056: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204119.5595334-17679-226115108300092/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204120.14059: _low_level_execute_command(): starting 16142 1727204120.14061: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204119.5595334-17679-226115108300092/ > /dev/null 2>&1 && sleep 0' 16142 1727204120.14978: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204120.15792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204120.15812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204120.15836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204120.15885: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204120.15897: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204120.15910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204120.15928: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204120.15942: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204120.15952: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204120.15963: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204120.15984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204120.15999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204120.16009: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204120.16027: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204120.16044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204120.16124: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204120.16153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204120.16173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204120.16247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204120.18150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204120.18155: stdout chunk (state=3): >>><<< 16142 1727204120.18157: stderr chunk (state=3): >>><<< 16142 1727204120.18474: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204120.18478: handler run complete 16142 1727204120.18480: attempt loop complete, returning result 16142 1727204120.18482: _execute() done 16142 1727204120.18484: dumping result to json 16142 1727204120.18486: done dumping result, returning 16142 1727204120.18489: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-fddd-f6c7-00000000003b] 16142 1727204120.18491: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000003b ok: [managed-node2] => { "changed": false, "ping": "pong" } 16142 1727204120.18628: no more pending results, returning what we have 16142 1727204120.18631: results queue empty 16142 1727204120.18632: checking for any_errors_fatal 16142 1727204120.18638: done checking for any_errors_fatal 16142 1727204120.18638: checking for max_fail_percentage 16142 1727204120.18640: done checking for max_fail_percentage 16142 1727204120.18641: checking to see if all hosts have failed and the running result is not ok 16142 1727204120.18641: done checking to see if all hosts have failed 16142 1727204120.18642: getting the remaining hosts for this loop 16142 1727204120.18643: done getting the remaining hosts for this loop 16142 1727204120.18647: getting the next task for host managed-node2 16142 1727204120.18656: done getting next task for host managed-node2 16142 1727204120.18658: ^ task is: TASK: meta (role_complete) 16142 1727204120.18661: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204120.18676: getting variables 16142 1727204120.18677: in VariableManager get_vars() 16142 1727204120.18727: Calling all_inventory to load vars for managed-node2 16142 1727204120.18730: Calling groups_inventory to load vars for managed-node2 16142 1727204120.18732: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204120.18742: Calling all_plugins_play to load vars for managed-node2 16142 1727204120.18745: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204120.18749: Calling groups_plugins_play to load vars for managed-node2 16142 1727204120.19319: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000003b 16142 1727204120.19324: WORKER PROCESS EXITING 16142 1727204120.21310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204120.25783: done with get_vars() 16142 1727204120.25821: done getting variables 16142 1727204120.26121: done queuing things up, now waiting for results queue to drain 16142 1727204120.26124: results queue empty 16142 1727204120.26125: checking for any_errors_fatal 16142 1727204120.26128: done checking for any_errors_fatal 16142 1727204120.26129: checking for max_fail_percentage 16142 1727204120.26130: done checking for max_fail_percentage 16142 1727204120.26131: checking to see if all hosts have failed and the running result is not ok 16142 1727204120.26132: done checking to see if all hosts have failed 16142 1727204120.26132: getting the remaining hosts for this loop 16142 1727204120.26133: done getting the remaining hosts for this loop 16142 1727204120.26136: getting the next task for host managed-node2 16142 1727204120.26142: done getting next task for host managed-node2 16142 1727204120.26145: ^ task is: TASK: Include the task 'get_interface_stat.yml' 16142 1727204120.26147: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204120.26150: getting variables 16142 1727204120.26151: in VariableManager get_vars() 16142 1727204120.26177: Calling all_inventory to load vars for managed-node2 16142 1727204120.26180: Calling groups_inventory to load vars for managed-node2 16142 1727204120.26181: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204120.26187: Calling all_plugins_play to load vars for managed-node2 16142 1727204120.26189: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204120.26191: Calling groups_plugins_play to load vars for managed-node2 16142 1727204120.28928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204120.30924: done with get_vars() 16142 1727204120.30962: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:55:20 -0400 (0:00:00.857) 0:00:19.487 ***** 16142 1727204120.31056: entering _queue_task() for managed-node2/include_tasks 16142 1727204120.31496: worker is 1 (out of 1 available) 16142 1727204120.31514: exiting _queue_task() for managed-node2/include_tasks 16142 1727204120.31528: done queuing things up, now waiting for results queue to drain 16142 1727204120.31530: waiting for pending results... 16142 1727204120.31866: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 16142 1727204120.31983: in run() - task 0affcd87-79f5-fddd-f6c7-00000000006e 16142 1727204120.32003: variable 'ansible_search_path' from source: unknown 16142 1727204120.32007: variable 'ansible_search_path' from source: unknown 16142 1727204120.32053: calling self._execute() 16142 1727204120.32154: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204120.32158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204120.32176: variable 'omit' from source: magic vars 16142 1727204120.32576: variable 'ansible_distribution_major_version' from source: facts 16142 1727204120.32602: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204120.32606: _execute() done 16142 1727204120.32612: dumping result to json 16142 1727204120.32615: done dumping result, returning 16142 1727204120.32623: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-fddd-f6c7-00000000006e] 16142 1727204120.32629: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000006e 16142 1727204120.32739: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000006e 16142 1727204120.32742: WORKER PROCESS EXITING 16142 1727204120.32778: no more pending results, returning what we have 16142 1727204120.32784: in VariableManager get_vars() 16142 1727204120.32856: Calling all_inventory to load vars for managed-node2 16142 1727204120.32859: Calling groups_inventory to load vars for managed-node2 16142 1727204120.32862: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204120.32877: Calling all_plugins_play to load vars for managed-node2 16142 1727204120.32880: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204120.32883: Calling groups_plugins_play to load vars for managed-node2 16142 1727204120.34536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 16142 1727204120.36275: done with get_vars() 16142 1727204120.36297: variable 'ansible_search_path' from source: unknown 16142 1727204120.36299: variable 'ansible_search_path' from source: unknown 16142 1727204120.36341: we have included files to process 16142 1727204120.36343: generating all_blocks data 16142 1727204120.36345: done generating all_blocks data 16142 1727204120.36351: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 16142 1727204120.36352: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 16142 1727204120.36354: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 16142 1727204120.36546: done processing included file 16142 1727204120.36548: iterating over new_blocks loaded from include file 16142 1727204120.36550: in VariableManager get_vars() 16142 1727204120.36580: done with get_vars() 16142 1727204120.36582: filtering new block on tags 16142 1727204120.36601: done filtering new block on tags 16142 1727204120.36603: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 16142 1727204120.36608: extending task lists for all hosts with included blocks 16142 1727204120.36720: done extending task lists 16142 1727204120.36721: done processing included files 16142 1727204120.36722: results queue empty 16142 1727204120.36723: checking for any_errors_fatal 16142 1727204120.36725: done checking for any_errors_fatal 16142 1727204120.36725: checking for max_fail_percentage 16142 1727204120.36727: done checking for max_fail_percentage 16142 1727204120.36727: checking to see if all hosts have failed and the running result is not ok 16142 1727204120.36728: done checking to see if all hosts have failed 16142 1727204120.36729: getting the remaining hosts for this loop 16142 1727204120.36730: done getting the remaining hosts for this loop 16142 1727204120.36732: getting the next task for host managed-node2 16142 1727204120.36737: done getting next task for host managed-node2 16142 1727204120.36739: ^ task is: TASK: Get stat for interface {{ interface }} 16142 1727204120.36742: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204120.36744: getting variables 16142 1727204120.36745: in VariableManager get_vars() 16142 1727204120.36771: Calling all_inventory to load vars for managed-node2 16142 1727204120.36773: Calling groups_inventory to load vars for managed-node2 16142 1727204120.36775: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204120.36780: Calling all_plugins_play to load vars for managed-node2 16142 1727204120.36783: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204120.36786: Calling groups_plugins_play to load vars for managed-node2 16142 1727204120.38185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204120.40737: done with get_vars() 16142 1727204120.40859: done getting variables 16142 1727204120.41189: variable 'interface' from source: task vars 16142 1727204120.41193: variable 'controller_device' from source: play vars 16142 1727204120.41412: variable 'controller_device' from source: play vars TASK [Get stat for interface nm-bond] ****************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:55:20 -0400 (0:00:00.105) 0:00:19.592 ***** 16142 1727204120.41603: entering _queue_task() for managed-node2/stat 16142 1727204120.42458: worker is 1 (out of 1 available) 16142 1727204120.42473: exiting _queue_task() for managed-node2/stat 16142 1727204120.42486: done queuing things up, now waiting for results queue to drain 16142 1727204120.43267: waiting for pending results... 16142 1727204120.43293: running TaskExecutor() for managed-node2/TASK: Get stat for interface nm-bond 16142 1727204120.43390: in run() - task 0affcd87-79f5-fddd-f6c7-000000000337 16142 1727204120.43415: variable 'ansible_search_path' from source: unknown 16142 1727204120.43423: variable 'ansible_search_path' from source: unknown 16142 1727204120.43471: calling self._execute() 16142 1727204120.43577: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204120.43587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204120.43601: variable 'omit' from source: magic vars 16142 1727204120.44003: variable 'ansible_distribution_major_version' from source: facts 16142 1727204120.44026: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204120.44042: variable 'omit' from source: magic vars 16142 1727204120.44110: variable 'omit' from source: magic vars 16142 1727204120.44223: variable 'interface' from source: task vars 16142 1727204120.44238: variable 'controller_device' from source: play vars 16142 1727204120.44316: variable 'controller_device' from source: play vars 16142 1727204120.44343: variable 'omit' from source: magic vars 16142 1727204120.44397: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204120.44440: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204120.44469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204120.44491: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204120.44512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 16142 1727204120.44550: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204120.44559: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204120.44572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204120.44685: Set connection var ansible_timeout to 10 16142 1727204120.44693: Set connection var ansible_connection to ssh 16142 1727204120.44702: Set connection var ansible_shell_type to sh 16142 1727204120.44716: Set connection var ansible_shell_executable to /bin/sh 16142 1727204120.44734: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204120.44748: Set connection var ansible_pipelining to False 16142 1727204120.44779: variable 'ansible_shell_executable' from source: unknown 16142 1727204120.44788: variable 'ansible_connection' from source: unknown 16142 1727204120.44796: variable 'ansible_module_compression' from source: unknown 16142 1727204120.44803: variable 'ansible_shell_type' from source: unknown 16142 1727204120.44812: variable 'ansible_shell_executable' from source: unknown 16142 1727204120.44820: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204120.44836: variable 'ansible_pipelining' from source: unknown 16142 1727204120.44845: variable 'ansible_timeout' from source: unknown 16142 1727204120.44854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204120.45070: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204120.45086: variable 'omit' from source: magic vars 16142 1727204120.45096: starting attempt loop 16142 1727204120.45102: running the handler 16142 1727204120.45121: _low_level_execute_command(): starting 16142 1727204120.45135: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204120.45927: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204120.45948: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204120.45968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204120.45988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204120.46040: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204120.46053: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204120.46072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204120.46092: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204120.46105: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204120.46117: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204120.46130: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204120.46149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204120.46167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 
1727204120.46179: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204120.46189: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204120.46202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204120.46288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204120.46309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204120.46322: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204120.46400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204120.48071: stdout chunk (state=3): >>>/root <<< 16142 1727204120.48272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204120.48276: stdout chunk (state=3): >>><<< 16142 1727204120.48288: stderr chunk (state=3): >>><<< 16142 1727204120.48471: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204120.48474: _low_level_execute_command(): starting 16142 1727204120.48477: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204120.4830837-17715-219704971436553 `" && echo ansible-tmp-1727204120.4830837-17715-219704971436553="` echo /root/.ansible/tmp/ansible-tmp-1727204120.4830837-17715-219704971436553 `" ) && sleep 0' 16142 1727204120.49203: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204120.49207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204120.49252: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204120.49255: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204120.49257: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204120.49334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204120.49338: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204120.49430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204120.49540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204120.51422: stdout chunk (state=3): >>>ansible-tmp-1727204120.4830837-17715-219704971436553=/root/.ansible/tmp/ansible-tmp-1727204120.4830837-17715-219704971436553 <<< 16142 1727204120.51536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204120.51624: stderr chunk (state=3): >>><<< 16142 1727204120.51627: stdout chunk (state=3): >>><<< 16142 1727204120.51672: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204120.4830837-17715-219704971436553=/root/.ansible/tmp/ansible-tmp-1727204120.4830837-17715-219704971436553 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204120.51972: variable 'ansible_module_compression' from source: unknown 16142 1727204120.51975: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 16142 1727204120.51977: variable 'ansible_facts' from source: unknown 16142 1727204120.51980: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204120.4830837-17715-219704971436553/AnsiballZ_stat.py 16142 1727204120.52648: Sending initial data 16142 1727204120.52651: Sent initial data (153 bytes) 16142 1727204120.55202: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204120.55219: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204120.55236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204120.55260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204120.55308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 
16142 1727204120.55373: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204120.55389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204120.55408: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204120.55421: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204120.55435: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204120.55448: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204120.55464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204120.55488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204120.55502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204120.55513: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204120.55526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204120.55708: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204120.55734: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204120.55752: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204120.55824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204120.57556: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204120.57566: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204120.57607: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpynmi64p8 /root/.ansible/tmp/ansible-tmp-1727204120.4830837-17715-219704971436553/AnsiballZ_stat.py <<< 16142 1727204120.57636: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204120.58943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204120.59030: stderr chunk (state=3): >>><<< 16142 1727204120.59036: stdout chunk (state=3): >>><<< 16142 1727204120.59055: done transferring module to remote 16142 1727204120.59067: _low_level_execute_command(): starting 16142 1727204120.59081: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204120.4830837-17715-219704971436553/ /root/.ansible/tmp/ansible-tmp-1727204120.4830837-17715-219704971436553/AnsiballZ_stat.py && sleep 0' 16142 1727204120.60738: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204120.60803: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 
1727204120.60818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204120.60835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204120.60875: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204120.60883: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204120.60894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204120.60917: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204120.61001: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204120.61013: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204120.61021: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204120.61036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204120.61048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204120.61058: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204120.61157: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204120.61160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204120.61170: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204120.61243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204120.61249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204120.61469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204120.63200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204120.63203: stdout chunk (state=3): >>><<< 16142 1727204120.63211: stderr chunk (state=3): >>><<< 16142 1727204120.63231: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204120.63237: _low_level_execute_command(): starting 16142 1727204120.63240: _low_level_execute_command(): executing: /bin/sh -c 
'/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204120.4830837-17715-219704971436553/AnsiballZ_stat.py && sleep 0' 16142 1727204120.65159: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204120.65180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204120.65194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204120.65217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204120.65261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204120.65339: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204120.65351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204120.65371: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204120.65380: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204120.65387: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204120.65396: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204120.65405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204120.65418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204120.65426: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204120.65439: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204120.65451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204120.65539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204120.65667: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204120.65670: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204120.65786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204120.79113: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28088, "dev": 21, "nlink": 1, "atime": 1727204118.7295911, "mtime": 1727204118.7295911, "ctime": 1727204118.7295911, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 16142 1727204120.80180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204120.80184: stdout chunk (state=3): >>><<< 16142 1727204120.80190: stderr chunk (state=3): >>><<< 16142 1727204120.80207: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28088, "dev": 21, "nlink": 1, "atime": 1727204118.7295911, "mtime": 1727204118.7295911, "ctime": 1727204118.7295911, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
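The stat call above inspects the sysfs entry for the bond device and comes back with islnk=true and lnk_target ../../devices/virtual/net/nm-bond, confirming the kernel exposes the interface. A plausible reconstruction of get_interface_stat.yml:3, assembled only from the module_args recorded in the log (path /sys/class/net/nm-bond with attribute, checksum, and mime gathering disabled) and the interface variable resolved from controller_device, is sketched below; the register name is an assumption, not quoted from the test source.

    # Hedged sketch of get_interface_stat.yml based on the logged module_args;
    # the real task file may differ, and the register name is assumed.
    - name: Get stat for interface {{ interface }}
      ansible.builtin.stat:
        path: "/sys/class/net/{{ interface }}"
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: interface_stat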
16142 1727204120.80349: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204120.4830837-17715-219704971436553/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204120.80353: _low_level_execute_command(): starting 16142 1727204120.80356: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204120.4830837-17715-219704971436553/ > /dev/null 2>&1 && sleep 0' 16142 1727204120.81082: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204120.81884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204120.81904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204120.81923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204120.81972: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204120.81991: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204120.82006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204120.82026: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204120.82039: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204120.82050: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204120.82283: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204120.82299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204120.82317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204120.82330: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204120.82342: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204120.82357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204120.82640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204120.82902: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204120.83128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204120.84835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204120.84914: stderr chunk (state=3): >>><<< 16142 1727204120.84918: stdout chunk (state=3): >>><<< 16142 1727204120.84973: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204120.84977: handler run complete 16142 1727204120.85172: attempt loop complete, returning result 16142 1727204120.85175: _execute() done 16142 1727204120.85178: dumping result to json 16142 1727204120.85180: done dumping result, returning 16142 1727204120.85182: done running TaskExecutor() for managed-node2/TASK: Get stat for interface nm-bond [0affcd87-79f5-fddd-f6c7-000000000337] 16142 1727204120.85184: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000337 16142 1727204120.85267: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000337 16142 1727204120.85272: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204118.7295911, "block_size": 4096, "blocks": 0, "ctime": 1727204118.7295911, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28088, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "mode": "0777", "mtime": 1727204118.7295911, "nlink": 1, "path": "/sys/class/net/nm-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 16142 1727204120.85366: no more pending results, returning what we have 16142 1727204120.85370: results queue empty 16142 1727204120.85371: checking for any_errors_fatal 16142 1727204120.85372: done checking for any_errors_fatal 16142 1727204120.85373: checking for max_fail_percentage 16142 1727204120.85374: done checking for max_fail_percentage 16142 1727204120.85375: checking to see if all hosts have failed and the running result is not ok 16142 1727204120.85376: done checking to see if all hosts have failed 16142 1727204120.85377: getting the remaining hosts for this loop 16142 1727204120.85378: done getting the remaining hosts for this loop 16142 1727204120.85382: getting the next task for host managed-node2 16142 1727204120.85390: done getting next task for host managed-node2 16142 1727204120.85393: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 16142 1727204120.85395: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204120.85400: getting variables 16142 1727204120.85401: in VariableManager get_vars() 16142 1727204120.85450: Calling all_inventory to load vars for managed-node2 16142 1727204120.85452: Calling groups_inventory to load vars for managed-node2 16142 1727204120.85459: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204120.85470: Calling all_plugins_play to load vars for managed-node2 16142 1727204120.85472: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204120.85475: Calling groups_plugins_play to load vars for managed-node2 16142 1727204120.87249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204120.90236: done with get_vars() 16142 1727204120.90273: done getting variables 16142 1727204120.90335: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204120.90457: variable 'interface' from source: task vars 16142 1727204120.90461: variable 'controller_device' from source: play vars 16142 1727204120.90532: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'nm-bond'] ************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:55:20 -0400 (0:00:00.489) 0:00:20.082 ***** 16142 1727204120.90569: entering _queue_task() for managed-node2/assert 16142 1727204120.90911: worker is 1 (out of 1 available) 16142 1727204120.90925: exiting _queue_task() for managed-node2/assert 16142 1727204120.90938: done queuing things up, now waiting for results queue to drain 16142 1727204120.90940: waiting for pending results... 
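The assert task just queued (assert_device_present.yml:5) is, judging from the conditional the log evaluates next (interface_stat.stat.exists) and the default "All assertions passed" message, roughly the following; a sketch under those assumptions, not the verbatim task file:

    - name: Assert that the interface is present - '{{ interface }}'
      ansible.builtin.assert:
        that:
          - interface_stat.stat.exists   # set by the preceding stat task; exact assertion list assumed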
16142 1727204120.91243: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'nm-bond' 16142 1727204120.91387: in run() - task 0affcd87-79f5-fddd-f6c7-00000000006f 16142 1727204120.91411: variable 'ansible_search_path' from source: unknown 16142 1727204120.91420: variable 'ansible_search_path' from source: unknown 16142 1727204120.91462: calling self._execute() 16142 1727204120.91570: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204120.91581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204120.91600: variable 'omit' from source: magic vars 16142 1727204120.91996: variable 'ansible_distribution_major_version' from source: facts 16142 1727204120.92016: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204120.92031: variable 'omit' from source: magic vars 16142 1727204120.92090: variable 'omit' from source: magic vars 16142 1727204120.92198: variable 'interface' from source: task vars 16142 1727204120.92207: variable 'controller_device' from source: play vars 16142 1727204120.92281: variable 'controller_device' from source: play vars 16142 1727204120.92305: variable 'omit' from source: magic vars 16142 1727204120.92350: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204120.92396: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204120.92421: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204120.92440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204120.92454: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204120.92494: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204120.92500: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204120.92506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204120.92609: Set connection var ansible_timeout to 10 16142 1727204120.92617: Set connection var ansible_connection to ssh 16142 1727204120.92625: Set connection var ansible_shell_type to sh 16142 1727204120.92633: Set connection var ansible_shell_executable to /bin/sh 16142 1727204120.92640: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204120.92650: Set connection var ansible_pipelining to False 16142 1727204120.92681: variable 'ansible_shell_executable' from source: unknown 16142 1727204120.92694: variable 'ansible_connection' from source: unknown 16142 1727204120.92705: variable 'ansible_module_compression' from source: unknown 16142 1727204120.92712: variable 'ansible_shell_type' from source: unknown 16142 1727204120.92719: variable 'ansible_shell_executable' from source: unknown 16142 1727204120.92726: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204120.92735: variable 'ansible_pipelining' from source: unknown 16142 1727204120.92743: variable 'ansible_timeout' from source: unknown 16142 1727204120.92751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204120.92886: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204120.92910: variable 'omit' from source: magic vars 16142 1727204120.92927: starting attempt loop 16142 1727204120.92934: running the handler 16142 1727204120.93083: variable 'interface_stat' from source: set_fact 16142 1727204120.93108: Evaluated conditional (interface_stat.stat.exists): True 16142 1727204120.93121: handler run complete 16142 1727204120.93151: attempt loop complete, returning result 16142 1727204120.93158: _execute() done 16142 1727204120.93167: dumping result to json 16142 1727204120.93175: done dumping result, returning 16142 1727204120.93187: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'nm-bond' [0affcd87-79f5-fddd-f6c7-00000000006f] 16142 1727204120.93197: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000006f ok: [managed-node2] => { "changed": false } MSG: All assertions passed 16142 1727204120.93357: no more pending results, returning what we have 16142 1727204120.93362: results queue empty 16142 1727204120.93363: checking for any_errors_fatal 16142 1727204120.93378: done checking for any_errors_fatal 16142 1727204120.93379: checking for max_fail_percentage 16142 1727204120.93381: done checking for max_fail_percentage 16142 1727204120.93382: checking to see if all hosts have failed and the running result is not ok 16142 1727204120.93383: done checking to see if all hosts have failed 16142 1727204120.93384: getting the remaining hosts for this loop 16142 1727204120.93386: done getting the remaining hosts for this loop 16142 1727204120.93390: getting the next task for host managed-node2 16142 1727204120.93400: done getting next task for host managed-node2 16142 1727204120.93404: ^ task is: TASK: Include the task 'assert_profile_present.yml' 16142 1727204120.93406: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204120.93410: getting variables 16142 1727204120.93412: in VariableManager get_vars() 16142 1727204120.93482: Calling all_inventory to load vars for managed-node2 16142 1727204120.93485: Calling groups_inventory to load vars for managed-node2 16142 1727204120.93488: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204120.93499: Calling all_plugins_play to load vars for managed-node2 16142 1727204120.93502: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204120.93506: Calling groups_plugins_play to load vars for managed-node2 16142 1727204120.94484: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000006f 16142 1727204120.94488: WORKER PROCESS EXITING 16142 1727204120.95485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204120.97181: done with get_vars() 16142 1727204120.97211: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:67 Tuesday 24 September 2024 14:55:20 -0400 (0:00:00.067) 0:00:20.149 ***** 16142 1727204120.97314: entering _queue_task() for managed-node2/include_tasks 16142 1727204120.97661: worker is 1 (out of 1 available) 16142 1727204120.97677: exiting _queue_task() for managed-node2/include_tasks 16142 1727204120.97688: done queuing things up, now waiting for results queue to drain 16142 1727204120.97690: waiting for pending results... 16142 1727204120.97990: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_profile_present.yml' 16142 1727204120.98093: in run() - task 0affcd87-79f5-fddd-f6c7-000000000070 16142 1727204120.98111: variable 'ansible_search_path' from source: unknown 16142 1727204120.98174: variable 'controller_profile' from source: play vars 16142 1727204120.98367: variable 'controller_profile' from source: play vars 16142 1727204120.98387: variable 'port1_profile' from source: play vars 16142 1727204120.98466: variable 'port1_profile' from source: play vars 16142 1727204120.98484: variable 'port2_profile' from source: play vars 16142 1727204120.98551: variable 'port2_profile' from source: play vars 16142 1727204120.98582: variable 'omit' from source: magic vars 16142 1727204120.98735: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204120.98750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204120.98770: variable 'omit' from source: magic vars 16142 1727204120.99030: variable 'ansible_distribution_major_version' from source: facts 16142 1727204120.99045: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204120.99080: variable 'item' from source: unknown 16142 1727204120.99152: variable 'item' from source: unknown 16142 1727204120.99348: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204120.99366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204120.99381: variable 'omit' from source: magic vars 16142 1727204120.99555: variable 'ansible_distribution_major_version' from source: facts 16142 1727204120.99570: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204120.99601: variable 'item' from source: unknown 16142 1727204120.99673: variable 'item' from source: unknown 16142 1727204120.99813: variable 'ansible_host' from 
source: host vars for 'managed-node2' 16142 1727204120.99828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204120.99843: variable 'omit' from source: magic vars 16142 1727204121.00003: variable 'ansible_distribution_major_version' from source: facts 16142 1727204121.00014: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204121.00045: variable 'item' from source: unknown 16142 1727204121.00115: variable 'item' from source: unknown 16142 1727204121.00202: dumping result to json 16142 1727204121.00212: done dumping result, returning 16142 1727204121.00223: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_profile_present.yml' [0affcd87-79f5-fddd-f6c7-000000000070] 16142 1727204121.00234: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000070 16142 1727204121.00329: no more pending results, returning what we have 16142 1727204121.00336: in VariableManager get_vars() 16142 1727204121.00400: Calling all_inventory to load vars for managed-node2 16142 1727204121.00403: Calling groups_inventory to load vars for managed-node2 16142 1727204121.00406: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204121.00420: Calling all_plugins_play to load vars for managed-node2 16142 1727204121.00424: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204121.00427: Calling groups_plugins_play to load vars for managed-node2 16142 1727204121.01517: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000070 16142 1727204121.01520: WORKER PROCESS EXITING 16142 1727204121.02171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204121.08495: done with get_vars() 16142 1727204121.08522: variable 'ansible_search_path' from source: unknown 16142 1727204121.08540: variable 'ansible_search_path' from source: unknown 16142 1727204121.08549: variable 'ansible_search_path' from source: unknown 16142 1727204121.08555: we have included files to process 16142 1727204121.08556: generating all_blocks data 16142 1727204121.08557: done generating all_blocks data 16142 1727204121.08560: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 16142 1727204121.08561: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 16142 1727204121.08566: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 16142 1727204121.08749: in VariableManager get_vars() 16142 1727204121.08782: done with get_vars() 16142 1727204121.09010: done processing included file 16142 1727204121.09012: iterating over new_blocks loaded from include file 16142 1727204121.09014: in VariableManager get_vars() 16142 1727204121.09035: done with get_vars() 16142 1727204121.09037: filtering new block on tags 16142 1727204121.09060: done filtering new block on tags 16142 1727204121.09063: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node2 => (item=bond0) 16142 1727204121.09069: processing included file: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 16142 1727204121.09070: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 16142 1727204121.09073: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 16142 1727204121.09168: in VariableManager get_vars() 16142 1727204121.09197: done with get_vars() 16142 1727204121.09416: done processing included file 16142 1727204121.09418: iterating over new_blocks loaded from include file 16142 1727204121.09420: in VariableManager get_vars() 16142 1727204121.09445: done with get_vars() 16142 1727204121.09447: filtering new block on tags 16142 1727204121.09469: done filtering new block on tags 16142 1727204121.09471: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node2 => (item=bond0.0) 16142 1727204121.09476: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 16142 1727204121.09477: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 16142 1727204121.09480: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 16142 1727204121.09655: in VariableManager get_vars() 16142 1727204121.09687: done with get_vars() 16142 1727204121.09929: done processing included file 16142 1727204121.09931: iterating over new_blocks loaded from include file 16142 1727204121.09932: in VariableManager get_vars() 16142 1727204121.09956: done with get_vars() 16142 1727204121.09959: filtering new block on tags 16142 1727204121.09979: done filtering new block on tags 16142 1727204121.09981: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node2 => (item=bond0.1) 16142 1727204121.09985: extending task lists for all hosts with included blocks 16142 1727204121.15830: done extending task lists 16142 1727204121.15837: done processing included files 16142 1727204121.15838: results queue empty 16142 1727204121.15839: checking for any_errors_fatal 16142 1727204121.15843: done checking for any_errors_fatal 16142 1727204121.15843: checking for max_fail_percentage 16142 1727204121.15845: done checking for max_fail_percentage 16142 1727204121.15845: checking to see if all hosts have failed and the running result is not ok 16142 1727204121.15846: done checking to see if all hosts have failed 16142 1727204121.15847: getting the remaining hosts for this loop 16142 1727204121.15848: done getting the remaining hosts for this loop 16142 1727204121.15850: getting the next task for host managed-node2 16142 1727204121.15855: done getting next task for host managed-node2 16142 1727204121.15857: ^ task is: TASK: Include the task 'get_profile_stat.yml' 16142 1727204121.15859: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204121.15862: getting variables 16142 1727204121.15863: in VariableManager get_vars() 16142 1727204121.15892: Calling all_inventory to load vars for managed-node2 16142 1727204121.15895: Calling groups_inventory to load vars for managed-node2 16142 1727204121.15897: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204121.15903: Calling all_plugins_play to load vars for managed-node2 16142 1727204121.15905: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204121.15909: Calling groups_plugins_play to load vars for managed-node2 16142 1727204121.17712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204121.19768: done with get_vars() 16142 1727204121.19791: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:55:21 -0400 (0:00:00.225) 0:00:20.375 ***** 16142 1727204121.19874: entering _queue_task() for managed-node2/include_tasks 16142 1727204121.20236: worker is 1 (out of 1 available) 16142 1727204121.20248: exiting _queue_task() for managed-node2/include_tasks 16142 1727204121.20263: done queuing things up, now waiting for results queue to drain 16142 1727204121.20267: waiting for pending results... 
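The three includes recorded above, one per item bond0, bond0.0 and bond0.1, indicate that tests_bond_removal.yml:67 loops assert_profile_present.yml over the controller and port profiles. A sketch of that include, assuming the profile variable is passed as an include parameter (the log later reports both item and profile as coming from "include params"); the exact loop form and paths in the real playbook may differ:

    - name: Include the task 'assert_profile_present.yml'
      ansible.builtin.include_tasks: tasks/assert_profile_present.yml
      vars:
        profile: "{{ item }}"            # assumed; 'profile' later appears as an include param
      loop:
        - "{{ controller_profile }}"     # bond0
        - "{{ port1_profile }}"          # bond0.0
        - "{{ port2_profile }}"          # bond0.1

assert_profile_present.yml then starts by including get_profile_stat.yml (line 3, per the task path above).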
16142 1727204121.20569: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 16142 1727204121.20689: in run() - task 0affcd87-79f5-fddd-f6c7-000000000355 16142 1727204121.20712: variable 'ansible_search_path' from source: unknown 16142 1727204121.20719: variable 'ansible_search_path' from source: unknown 16142 1727204121.20762: calling self._execute() 16142 1727204121.20918: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204121.20934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204121.20949: variable 'omit' from source: magic vars 16142 1727204121.23172: variable 'ansible_distribution_major_version' from source: facts 16142 1727204121.23289: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204121.23304: _execute() done 16142 1727204121.23317: dumping result to json 16142 1727204121.23428: done dumping result, returning 16142 1727204121.23445: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-fddd-f6c7-000000000355] 16142 1727204121.23458: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000355 16142 1727204121.23608: no more pending results, returning what we have 16142 1727204121.23616: in VariableManager get_vars() 16142 1727204121.23691: Calling all_inventory to load vars for managed-node2 16142 1727204121.23695: Calling groups_inventory to load vars for managed-node2 16142 1727204121.23698: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204121.23713: Calling all_plugins_play to load vars for managed-node2 16142 1727204121.23717: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204121.23721: Calling groups_plugins_play to load vars for managed-node2 16142 1727204121.24891: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000355 16142 1727204121.24896: WORKER PROCESS EXITING 16142 1727204121.25989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204121.30153: done with get_vars() 16142 1727204121.30185: variable 'ansible_search_path' from source: unknown 16142 1727204121.30187: variable 'ansible_search_path' from source: unknown 16142 1727204121.30229: we have included files to process 16142 1727204121.30230: generating all_blocks data 16142 1727204121.30235: done generating all_blocks data 16142 1727204121.30236: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 16142 1727204121.30238: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 16142 1727204121.30240: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 16142 1727204121.31544: done processing included file 16142 1727204121.31546: iterating over new_blocks loaded from include file 16142 1727204121.31548: in VariableManager get_vars() 16142 1727204121.31582: done with get_vars() 16142 1727204121.31589: filtering new block on tags 16142 1727204121.31615: done filtering new block on tags 16142 1727204121.31618: in VariableManager get_vars() 16142 1727204121.31649: done with get_vars() 16142 1727204121.31651: filtering new block on tags 16142 1727204121.31675: done filtering new block on tags 16142 1727204121.31678: done iterating over 
new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 16142 1727204121.31684: extending task lists for all hosts with included blocks 16142 1727204121.31854: done extending task lists 16142 1727204121.31856: done processing included files 16142 1727204121.31857: results queue empty 16142 1727204121.31857: checking for any_errors_fatal 16142 1727204121.31860: done checking for any_errors_fatal 16142 1727204121.31861: checking for max_fail_percentage 16142 1727204121.31862: done checking for max_fail_percentage 16142 1727204121.31863: checking to see if all hosts have failed and the running result is not ok 16142 1727204121.31866: done checking to see if all hosts have failed 16142 1727204121.31866: getting the remaining hosts for this loop 16142 1727204121.31868: done getting the remaining hosts for this loop 16142 1727204121.31870: getting the next task for host managed-node2 16142 1727204121.31874: done getting next task for host managed-node2 16142 1727204121.31877: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 16142 1727204121.31879: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204121.31882: getting variables 16142 1727204121.31883: in VariableManager get_vars() 16142 1727204121.31991: Calling all_inventory to load vars for managed-node2 16142 1727204121.31994: Calling groups_inventory to load vars for managed-node2 16142 1727204121.31997: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204121.32002: Calling all_plugins_play to load vars for managed-node2 16142 1727204121.32004: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204121.32007: Calling groups_plugins_play to load vars for managed-node2 16142 1727204121.33405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204121.35197: done with get_vars() 16142 1727204121.35227: done getting variables 16142 1727204121.35282: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:55:21 -0400 (0:00:00.154) 0:00:20.530 ***** 16142 1727204121.35323: entering _queue_task() for managed-node2/set_fact 16142 1727204121.35695: worker is 1 (out of 1 available) 16142 1727204121.35707: exiting _queue_task() for managed-node2/set_fact 16142 1727204121.35719: done queuing things up, now waiting for results queue to drain 16142 1727204121.35721: waiting for pending results... 
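The set_fact task queued here (get_profile_stat.yml:3) initializes the three lsr_net_profile_* flags that appear in the result further down. A minimal sketch, assuming it does nothing beyond setting the three facts to false (exactly the ansible_facts reported in the ok result):

    - name: Initialize NM profile exist and ansible_managed comment flag
      ansible.builtin.set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false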
16142 1727204121.36028: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 16142 1727204121.36164: in run() - task 0affcd87-79f5-fddd-f6c7-0000000005e4 16142 1727204121.36193: variable 'ansible_search_path' from source: unknown 16142 1727204121.36202: variable 'ansible_search_path' from source: unknown 16142 1727204121.36248: calling self._execute() 16142 1727204121.36355: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204121.36371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204121.36393: variable 'omit' from source: magic vars 16142 1727204121.36802: variable 'ansible_distribution_major_version' from source: facts 16142 1727204121.36829: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204121.36845: variable 'omit' from source: magic vars 16142 1727204121.36899: variable 'omit' from source: magic vars 16142 1727204121.36948: variable 'omit' from source: magic vars 16142 1727204121.36997: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204121.37045: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204121.37074: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204121.37098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204121.37118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204121.37163: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204121.37173: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204121.37180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204121.37291: Set connection var ansible_timeout to 10 16142 1727204121.37300: Set connection var ansible_connection to ssh 16142 1727204121.37310: Set connection var ansible_shell_type to sh 16142 1727204121.37320: Set connection var ansible_shell_executable to /bin/sh 16142 1727204121.37330: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204121.37346: Set connection var ansible_pipelining to False 16142 1727204121.37379: variable 'ansible_shell_executable' from source: unknown 16142 1727204121.37387: variable 'ansible_connection' from source: unknown 16142 1727204121.37393: variable 'ansible_module_compression' from source: unknown 16142 1727204121.37398: variable 'ansible_shell_type' from source: unknown 16142 1727204121.37404: variable 'ansible_shell_executable' from source: unknown 16142 1727204121.37409: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204121.37415: variable 'ansible_pipelining' from source: unknown 16142 1727204121.37420: variable 'ansible_timeout' from source: unknown 16142 1727204121.37426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204121.37563: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204121.37586: variable 
'omit' from source: magic vars 16142 1727204121.37599: starting attempt loop 16142 1727204121.37604: running the handler 16142 1727204121.37619: handler run complete 16142 1727204121.37634: attempt loop complete, returning result 16142 1727204121.37640: _execute() done 16142 1727204121.37645: dumping result to json 16142 1727204121.37651: done dumping result, returning 16142 1727204121.37659: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-fddd-f6c7-0000000005e4] 16142 1727204121.37670: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000005e4 16142 1727204121.37774: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000005e4 16142 1727204121.37780: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 16142 1727204121.37851: no more pending results, returning what we have 16142 1727204121.37856: results queue empty 16142 1727204121.37857: checking for any_errors_fatal 16142 1727204121.37858: done checking for any_errors_fatal 16142 1727204121.37859: checking for max_fail_percentage 16142 1727204121.37861: done checking for max_fail_percentage 16142 1727204121.37862: checking to see if all hosts have failed and the running result is not ok 16142 1727204121.37863: done checking to see if all hosts have failed 16142 1727204121.37865: getting the remaining hosts for this loop 16142 1727204121.37867: done getting the remaining hosts for this loop 16142 1727204121.37871: getting the next task for host managed-node2 16142 1727204121.37878: done getting next task for host managed-node2 16142 1727204121.37881: ^ task is: TASK: Stat profile file 16142 1727204121.37885: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204121.37890: getting variables 16142 1727204121.37892: in VariableManager get_vars() 16142 1727204121.37954: Calling all_inventory to load vars for managed-node2 16142 1727204121.37957: Calling groups_inventory to load vars for managed-node2 16142 1727204121.37959: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204121.37971: Calling all_plugins_play to load vars for managed-node2 16142 1727204121.37974: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204121.37976: Calling groups_plugins_play to load vars for managed-node2 16142 1727204121.39769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204121.41683: done with get_vars() 16142 1727204121.41719: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:55:21 -0400 (0:00:00.065) 0:00:20.595 ***** 16142 1727204121.41830: entering _queue_task() for managed-node2/stat 16142 1727204121.42208: worker is 1 (out of 1 available) 16142 1727204121.42220: exiting _queue_task() for managed-node2/stat 16142 1727204121.42234: done queuing things up, now waiting for results queue to drain 16142 1727204121.42236: waiting for pending results... 16142 1727204121.42567: running TaskExecutor() for managed-node2/TASK: Stat profile file 16142 1727204121.42688: in run() - task 0affcd87-79f5-fddd-f6c7-0000000005e5 16142 1727204121.42708: variable 'ansible_search_path' from source: unknown 16142 1727204121.42714: variable 'ansible_search_path' from source: unknown 16142 1727204121.42762: calling self._execute() 16142 1727204121.42870: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204121.42882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204121.42898: variable 'omit' from source: magic vars 16142 1727204121.43339: variable 'ansible_distribution_major_version' from source: facts 16142 1727204121.43361: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204121.43374: variable 'omit' from source: magic vars 16142 1727204121.43440: variable 'omit' from source: magic vars 16142 1727204121.43579: variable 'profile' from source: include params 16142 1727204121.43597: variable 'item' from source: include params 16142 1727204121.43685: variable 'item' from source: include params 16142 1727204121.43710: variable 'omit' from source: magic vars 16142 1727204121.43772: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204121.43815: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204121.43849: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204121.43872: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204121.43892: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204121.43936: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204121.43981: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204121.44002: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204121.44305: Set connection var ansible_timeout to 10 16142 1727204121.44387: Set connection var ansible_connection to ssh 16142 1727204121.44399: Set connection var ansible_shell_type to sh 16142 1727204121.44409: Set connection var ansible_shell_executable to /bin/sh 16142 1727204121.44420: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204121.44440: Set connection var ansible_pipelining to False 16142 1727204121.44470: variable 'ansible_shell_executable' from source: unknown 16142 1727204121.44516: variable 'ansible_connection' from source: unknown 16142 1727204121.44546: variable 'ansible_module_compression' from source: unknown 16142 1727204121.44554: variable 'ansible_shell_type' from source: unknown 16142 1727204121.44562: variable 'ansible_shell_executable' from source: unknown 16142 1727204121.44586: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204121.44603: variable 'ansible_pipelining' from source: unknown 16142 1727204121.44657: variable 'ansible_timeout' from source: unknown 16142 1727204121.44669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204121.45252: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204121.45270: variable 'omit' from source: magic vars 16142 1727204121.45638: starting attempt loop 16142 1727204121.45642: running the handler 16142 1727204121.45659: _low_level_execute_command(): starting 16142 1727204121.45667: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204121.46420: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204121.46438: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204121.46447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204121.46462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204121.46503: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204121.46511: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204121.46521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204121.46538: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204121.46547: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204121.46555: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204121.46562: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204121.46576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204121.46588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204121.46596: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204121.46603: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204121.46612: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204121.46689: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204121.46705: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204121.46710: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204121.46878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204121.48496: stdout chunk (state=3): >>>/root <<< 16142 1727204121.48670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204121.48679: stdout chunk (state=3): >>><<< 16142 1727204121.48689: stderr chunk (state=3): >>><<< 16142 1727204121.48713: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204121.48727: _low_level_execute_command(): starting 16142 1727204121.48736: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204121.4871264-17758-108851910355966 `" && echo ansible-tmp-1727204121.4871264-17758-108851910355966="` echo /root/.ansible/tmp/ansible-tmp-1727204121.4871264-17758-108851910355966 `" ) && sleep 0' 16142 1727204121.50040: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204121.50044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204121.50714: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204121.50730: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204121.50749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204121.50773: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204121.50785: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204121.50796: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204121.50807: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204121.50822: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204121.50842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204121.50855: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204121.50869: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204121.50884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204121.50971: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204121.50997: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204121.51014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204121.51097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204121.52957: stdout chunk (state=3): >>>ansible-tmp-1727204121.4871264-17758-108851910355966=/root/.ansible/tmp/ansible-tmp-1727204121.4871264-17758-108851910355966 <<< 16142 1727204121.53169: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204121.53173: stdout chunk (state=3): >>><<< 16142 1727204121.53176: stderr chunk (state=3): >>><<< 16142 1727204121.53469: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204121.4871264-17758-108851910355966=/root/.ansible/tmp/ansible-tmp-1727204121.4871264-17758-108851910355966 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204121.53474: variable 'ansible_module_compression' from source: unknown 16142 1727204121.53476: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 16142 1727204121.53478: variable 'ansible_facts' from source: unknown 16142 1727204121.53480: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204121.4871264-17758-108851910355966/AnsiballZ_stat.py 16142 1727204121.53952: Sending initial data 16142 1727204121.53962: Sent initial data (153 bytes) 16142 1727204121.55811: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204121.56489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204121.56510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204121.56534: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204121.56584: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204121.56597: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204121.56611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204121.56630: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204121.56647: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204121.56659: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204121.56680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204121.56694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204121.56711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204121.56725: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204121.56741: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204121.56756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204121.56842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204121.57490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204121.57508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204121.57579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204121.59292: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204121.59333: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204121.59373: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpfz83pd1a /root/.ansible/tmp/ansible-tmp-1727204121.4871264-17758-108851910355966/AnsiballZ_stat.py <<< 16142 1727204121.59413: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204121.60796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204121.60889: stderr chunk (state=3): >>><<< 16142 1727204121.60892: stdout chunk (state=3): >>><<< 16142 1727204121.60917: done transferring module to remote 16142 1727204121.60928: _low_level_execute_command(): starting 16142 1727204121.60934: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204121.4871264-17758-108851910355966/ 
/root/.ansible/tmp/ansible-tmp-1727204121.4871264-17758-108851910355966/AnsiballZ_stat.py && sleep 0' 16142 1727204121.62558: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204121.62567: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204121.62586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204121.62598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204121.62642: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204121.62660: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204121.62680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204121.62695: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204121.62773: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204121.62779: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204121.62787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204121.62796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204121.62807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204121.62815: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204121.62821: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204121.62830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204121.62915: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204121.62997: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204121.63007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204121.63211: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204121.64883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204121.64934: stderr chunk (state=3): >>><<< 16142 1727204121.64939: stdout chunk (state=3): >>><<< 16142 1727204121.64961: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204121.64965: _low_level_execute_command(): starting 16142 1727204121.64971: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204121.4871264-17758-108851910355966/AnsiballZ_stat.py && sleep 0' 16142 1727204121.66035: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204121.66052: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204121.66067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204121.66084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204121.66129: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204121.66141: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204121.66158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204121.66178: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204121.66188: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204121.66197: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204121.66207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204121.66219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204121.66234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204121.66247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204121.66260: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204121.66282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204121.66350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204121.66374: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204121.66392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204121.66473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204121.79583: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 16142 1727204121.80592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204121.80684: stderr chunk (state=3): >>><<< 16142 1727204121.80688: stdout chunk (state=3): >>><<< 16142 1727204121.80774: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 16142 1727204121.80783: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204121.4871264-17758-108851910355966/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204121.80786: _low_level_execute_command(): starting 16142 1727204121.80789: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204121.4871264-17758-108851910355966/ > /dev/null 2>&1 && sleep 0' 16142 1727204121.83359: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204121.83379: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204121.83397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204121.83415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204121.83546: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204121.83560: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204121.83580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204121.83599: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 16142 1727204121.83617: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204121.83629: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204121.83642: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204121.83656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204121.83676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204121.83688: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204121.83699: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204121.83717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204121.83855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204121.83876: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204121.83949: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204121.84056: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204121.85888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204121.85990: stderr chunk (state=3): >>><<< 16142 1727204121.85994: stdout chunk (state=3): >>><<< 16142 1727204121.86172: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204121.86176: handler run complete 16142 1727204121.86178: attempt loop complete, returning result 16142 1727204121.86180: _execute() done 16142 1727204121.86182: dumping result to json 16142 1727204121.86184: done dumping result, returning 16142 1727204121.86185: done running TaskExecutor() for managed-node2/TASK: Stat profile file [0affcd87-79f5-fddd-f6c7-0000000005e5] 16142 1727204121.86187: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000005e5 16142 1727204121.86257: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000005e5 16142 1727204121.86260: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 16142 1727204121.86321: no more pending results, returning what we have 16142 
1727204121.86325: results queue empty 16142 1727204121.86326: checking for any_errors_fatal 16142 1727204121.86334: done checking for any_errors_fatal 16142 1727204121.86335: checking for max_fail_percentage 16142 1727204121.86338: done checking for max_fail_percentage 16142 1727204121.86339: checking to see if all hosts have failed and the running result is not ok 16142 1727204121.86340: done checking to see if all hosts have failed 16142 1727204121.86341: getting the remaining hosts for this loop 16142 1727204121.86342: done getting the remaining hosts for this loop 16142 1727204121.86346: getting the next task for host managed-node2 16142 1727204121.86353: done getting next task for host managed-node2 16142 1727204121.86356: ^ task is: TASK: Set NM profile exist flag based on the profile files 16142 1727204121.86361: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204121.86367: getting variables 16142 1727204121.86369: in VariableManager get_vars() 16142 1727204121.86424: Calling all_inventory to load vars for managed-node2 16142 1727204121.86428: Calling groups_inventory to load vars for managed-node2 16142 1727204121.86430: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204121.86441: Calling all_plugins_play to load vars for managed-node2 16142 1727204121.86444: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204121.86446: Calling groups_plugins_play to load vars for managed-node2 16142 1727204121.89312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204121.93048: done with get_vars() 16142 1727204121.93083: done getting variables 16142 1727204121.93149: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:55:21 -0400 (0:00:00.513) 0:00:21.108 ***** 16142 1727204121.93190: entering _queue_task() for managed-node2/set_fact 16142 1727204121.93568: worker is 1 (out of 1 available) 16142 1727204121.93584: exiting _queue_task() for managed-node2/set_fact 16142 1727204121.93597: done queuing things up, now waiting for results queue to drain 16142 1727204121.93598: waiting for pending results... 
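For reference, the "Stat profile file" task whose ok result appears just above most likely reads roughly as follows in get_profile_stat.yml. This is a sketch reconstructed from the logged module_args; the use of a profile variable in the path and the profile_stat register name are inferred from surrounding log lines rather than taken from the playbook source:

- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"   # resolved to ifcfg-bond0 in this run
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat   # consumed below as profile_stat.stat.exists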
16142 1727204121.94270: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 16142 1727204121.94394: in run() - task 0affcd87-79f5-fddd-f6c7-0000000005e6 16142 1727204121.94416: variable 'ansible_search_path' from source: unknown 16142 1727204121.94424: variable 'ansible_search_path' from source: unknown 16142 1727204121.94478: calling self._execute() 16142 1727204121.94619: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204121.94636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204121.94661: variable 'omit' from source: magic vars 16142 1727204121.95139: variable 'ansible_distribution_major_version' from source: facts 16142 1727204121.95179: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204121.95310: variable 'profile_stat' from source: set_fact 16142 1727204121.95339: Evaluated conditional (profile_stat.stat.exists): False 16142 1727204121.95347: when evaluation is False, skipping this task 16142 1727204121.95354: _execute() done 16142 1727204121.95361: dumping result to json 16142 1727204121.95370: done dumping result, returning 16142 1727204121.95380: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-fddd-f6c7-0000000005e6] 16142 1727204121.95390: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000005e6 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16142 1727204121.95551: no more pending results, returning what we have 16142 1727204121.95555: results queue empty 16142 1727204121.95556: checking for any_errors_fatal 16142 1727204121.95566: done checking for any_errors_fatal 16142 1727204121.95567: checking for max_fail_percentage 16142 1727204121.95570: done checking for max_fail_percentage 16142 1727204121.95571: checking to see if all hosts have failed and the running result is not ok 16142 1727204121.95572: done checking to see if all hosts have failed 16142 1727204121.95572: getting the remaining hosts for this loop 16142 1727204121.95574: done getting the remaining hosts for this loop 16142 1727204121.95578: getting the next task for host managed-node2 16142 1727204121.95585: done getting next task for host managed-node2 16142 1727204121.95588: ^ task is: TASK: Get NM profile info 16142 1727204121.95592: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204121.95597: getting variables 16142 1727204121.95601: in VariableManager get_vars() 16142 1727204121.95672: Calling all_inventory to load vars for managed-node2 16142 1727204121.95675: Calling groups_inventory to load vars for managed-node2 16142 1727204121.95678: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204121.95692: Calling all_plugins_play to load vars for managed-node2 16142 1727204121.95695: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204121.95697: Calling groups_plugins_play to load vars for managed-node2 16142 1727204121.96742: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000005e6 16142 1727204121.96746: WORKER PROCESS EXITING 16142 1727204121.97604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204122.00237: done with get_vars() 16142 1727204122.00269: done getting variables 16142 1727204122.00458: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:55:22 -0400 (0:00:00.073) 0:00:21.181 ***** 16142 1727204122.00494: entering _queue_task() for managed-node2/shell 16142 1727204122.01269: worker is 1 (out of 1 available) 16142 1727204122.01282: exiting _queue_task() for managed-node2/shell 16142 1727204122.01407: done queuing things up, now waiting for results queue to drain 16142 1727204122.01409: waiting for pending results... 
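The task skipped just above gates on profile_stat.stat.exists, which is False here because ifcfg-bond0 does not exist under /etc/sysconfig/network-scripts. A minimal sketch of such a task is below; only the task name and the when: condition come from the log, and the fact it would set is an assumption:

- name: Set NM profile exist flag based on the profile files
  set_fact:
    lsr_net_profile_exists: true   # assumed fact name, not shown in the log for this skipped branch
  when: profile_stat.stat.exists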
16142 1727204122.01943: running TaskExecutor() for managed-node2/TASK: Get NM profile info 16142 1727204122.02076: in run() - task 0affcd87-79f5-fddd-f6c7-0000000005e7 16142 1727204122.02101: variable 'ansible_search_path' from source: unknown 16142 1727204122.02110: variable 'ansible_search_path' from source: unknown 16142 1727204122.02154: calling self._execute() 16142 1727204122.02273: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204122.02291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204122.02306: variable 'omit' from source: magic vars 16142 1727204122.02714: variable 'ansible_distribution_major_version' from source: facts 16142 1727204122.02741: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204122.02758: variable 'omit' from source: magic vars 16142 1727204122.02813: variable 'omit' from source: magic vars 16142 1727204122.02939: variable 'profile' from source: include params 16142 1727204122.02952: variable 'item' from source: include params 16142 1727204122.03026: variable 'item' from source: include params 16142 1727204122.03058: variable 'omit' from source: magic vars 16142 1727204122.03110: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204122.03152: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204122.03188: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204122.03209: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204122.03226: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204122.03265: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204122.03278: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204122.03288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204122.03403: Set connection var ansible_timeout to 10 16142 1727204122.03411: Set connection var ansible_connection to ssh 16142 1727204122.03420: Set connection var ansible_shell_type to sh 16142 1727204122.03429: Set connection var ansible_shell_executable to /bin/sh 16142 1727204122.03441: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204122.03452: Set connection var ansible_pipelining to False 16142 1727204122.03490: variable 'ansible_shell_executable' from source: unknown 16142 1727204122.03505: variable 'ansible_connection' from source: unknown 16142 1727204122.03529: variable 'ansible_module_compression' from source: unknown 16142 1727204122.03540: variable 'ansible_shell_type' from source: unknown 16142 1727204122.03560: variable 'ansible_shell_executable' from source: unknown 16142 1727204122.03571: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204122.03582: variable 'ansible_pipelining' from source: unknown 16142 1727204122.03604: variable 'ansible_timeout' from source: unknown 16142 1727204122.03620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204122.03781: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204122.03806: variable 'omit' from source: magic vars 16142 1727204122.03822: starting attempt loop 16142 1727204122.03831: running the handler 16142 1727204122.03846: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204122.03871: _low_level_execute_command(): starting 16142 1727204122.03883: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204122.04745: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204122.04762: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204122.04781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204122.04810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204122.04862: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204122.04877: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204122.04891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204122.04917: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204122.04936: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204122.04947: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204122.04960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204122.04975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204122.04990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204122.05002: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204122.05012: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204122.05034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204122.05114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204122.05141: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204122.05156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204122.05248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204122.06904: stdout chunk (state=3): >>>/root <<< 16142 1727204122.07005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204122.07094: stderr chunk (state=3): >>><<< 16142 1727204122.07098: stdout chunk (state=3): >>><<< 16142 1727204122.07133: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204122.07144: _low_level_execute_command(): starting 16142 1727204122.07148: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204122.0712075-17788-166877589302818 `" && echo ansible-tmp-1727204122.0712075-17788-166877589302818="` echo /root/.ansible/tmp/ansible-tmp-1727204122.0712075-17788-166877589302818 `" ) && sleep 0' 16142 1727204122.07828: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204122.07832: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204122.07835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204122.07873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204122.07889: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204122.07896: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204122.07906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204122.07920: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204122.07927: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204122.07934: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204122.07945: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204122.07955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204122.07967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204122.07975: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204122.07982: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204122.07990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204122.08068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204122.08088: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204122.08101: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204122.08381: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 16142 1727204122.10081: stdout chunk (state=3): >>>ansible-tmp-1727204122.0712075-17788-166877589302818=/root/.ansible/tmp/ansible-tmp-1727204122.0712075-17788-166877589302818 <<< 16142 1727204122.10270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204122.10285: stdout chunk (state=3): >>><<< 16142 1727204122.10290: stderr chunk (state=3): >>><<< 16142 1727204122.10311: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204122.0712075-17788-166877589302818=/root/.ansible/tmp/ansible-tmp-1727204122.0712075-17788-166877589302818 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204122.10350: variable 'ansible_module_compression' from source: unknown 16142 1727204122.10434: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 16142 1727204122.10475: variable 'ansible_facts' from source: unknown 16142 1727204122.10530: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204122.0712075-17788-166877589302818/AnsiballZ_command.py 16142 1727204122.10773: Sending initial data 16142 1727204122.10776: Sent initial data (156 bytes) 16142 1727204122.11671: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204122.11677: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204122.11699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204122.11703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204122.11740: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204122.11748: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204122.11758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204122.11773: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204122.11781: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204122.11789: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204122.11796: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204122.11807: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204122.11817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204122.11824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204122.11830: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204122.11844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204122.11934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204122.11939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204122.11944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204122.12020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204122.13775: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204122.13781: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204122.13819: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpxspbg8d2 /root/.ansible/tmp/ansible-tmp-1727204122.0712075-17788-166877589302818/AnsiballZ_command.py <<< 16142 1727204122.13838: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204122.14817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204122.14930: stderr chunk (state=3): >>><<< 16142 1727204122.14933: stdout chunk (state=3): >>><<< 16142 1727204122.14954: done transferring module to remote 16142 1727204122.14962: _low_level_execute_command(): starting 16142 1727204122.14971: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204122.0712075-17788-166877589302818/ /root/.ansible/tmp/ansible-tmp-1727204122.0712075-17788-166877589302818/AnsiballZ_command.py && sleep 0' 16142 1727204122.15430: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204122.15438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204122.15468: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204122.15483: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204122.15489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 16142 1727204122.15496: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204122.15502: 
stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204122.15509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204122.15518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204122.15524: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204122.15578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204122.15590: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204122.15593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204122.15645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204122.18440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204122.18500: stderr chunk (state=3): >>><<< 16142 1727204122.18504: stdout chunk (state=3): >>><<< 16142 1727204122.18520: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204122.18523: _low_level_execute_command(): starting 16142 1727204122.18529: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204122.0712075-17788-166877589302818/AnsiballZ_command.py && sleep 0' 16142 1727204122.19005: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204122.19009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204122.19052: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204122.19057: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204122.19120: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204122.19124: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204122.19185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204122.35011: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0 /etc/NetworkManager/system-connections/bond0.nmconnection \nbond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-24 14:55:22.326389", "end": "2024-09-24 14:55:22.349324", "delta": "0:00:00.022935", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 16142 1727204122.36271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 16142 1727204122.36328: stderr chunk (state=3): >>><<< 16142 1727204122.36332: stdout chunk (state=3): >>><<< 16142 1727204122.36352: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0 /etc/NetworkManager/system-connections/bond0.nmconnection \nbond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-24 14:55:22.326389", "end": "2024-09-24 14:55:22.349324", "delta": "0:00:00.022935", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 16142 1727204122.36385: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204122.0712075-17788-166877589302818/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204122.36392: _low_level_execute_command(): starting 16142 1727204122.36397: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204122.0712075-17788-166877589302818/ > /dev/null 2>&1 && sleep 0' 16142 1727204122.36873: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204122.36882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204122.36918: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204122.36923: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204122.36931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204122.36945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204122.36955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204122.37000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204122.37013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204122.37022: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204122.37075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204122.38856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204122.38916: stderr chunk (state=3): >>><<< 16142 1727204122.38919: stdout chunk (state=3): >>><<< 16142 1727204122.38939: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204122.38944: handler run complete 16142 1727204122.38963: Evaluated conditional (False): False 16142 1727204122.38973: attempt loop complete, returning result 16142 1727204122.38975: _execute() done 16142 1727204122.38978: dumping result to json 16142 1727204122.38983: done dumping result, returning 16142 1727204122.39000: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [0affcd87-79f5-fddd-f6c7-0000000005e7] 16142 1727204122.39015: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000005e7 16142 1727204122.39114: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000005e7 16142 1727204122.39117: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.022935", "end": "2024-09-24 14:55:22.349324", "rc": 0, "start": "2024-09-24 14:55:22.326389" } STDOUT: bond0 /etc/NetworkManager/system-connections/bond0.nmconnection bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 16142 1727204122.39217: no more pending results, returning what we have 16142 1727204122.39221: results queue empty 16142 1727204122.39222: checking for any_errors_fatal 16142 1727204122.39229: done checking for any_errors_fatal 16142 1727204122.39230: checking for max_fail_percentage 16142 1727204122.39232: done checking for max_fail_percentage 16142 1727204122.39232: checking to see if all hosts have failed and the running result is not ok 16142 1727204122.39233: done checking to see if all hosts have failed 16142 1727204122.39234: getting the remaining hosts for this loop 16142 1727204122.39235: done getting the remaining hosts for this loop 16142 1727204122.39239: getting the next task for host managed-node2 16142 1727204122.39245: done getting next task for host managed-node2 16142 1727204122.39247: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 16142 1727204122.39251: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204122.39254: getting variables 16142 1727204122.39256: in VariableManager get_vars() 16142 1727204122.39309: Calling all_inventory to load vars for managed-node2 16142 1727204122.39312: Calling groups_inventory to load vars for managed-node2 16142 1727204122.39314: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204122.39324: Calling all_plugins_play to load vars for managed-node2 16142 1727204122.39326: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204122.39329: Calling groups_plugins_play to load vars for managed-node2 16142 1727204122.40256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204122.41955: done with get_vars() 16142 1727204122.41993: done getting variables 16142 1727204122.42060: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:55:22 -0400 (0:00:00.416) 0:00:21.597 ***** 16142 1727204122.42101: entering _queue_task() for managed-node2/set_fact 16142 1727204122.42445: worker is 1 (out of 1 available) 16142 1727204122.42459: exiting _queue_task() for managed-node2/set_fact 16142 1727204122.42472: done queuing things up, now waiting for results queue to drain 16142 1727204122.42474: waiting for pending results... 
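The "Get NM profile info" task that just returned rc=0 ran a shell pipeline and registered its result as nm_profile_exists, both of which are visible in the log. A plausible sketch follows; writing the pattern as {{ profile }} instead of the literal bond0, and the failed_when setting, are assumptions:

- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
  register: nm_profile_exists
  failed_when: false   # assumption, suggested by the "Evaluated conditional (False): False" line above

With three matching bond0*.nmconnection keyfiles under /etc/NetworkManager/system-connections, grep finds output and the pipeline exits 0, which is the condition the next set_fact task keys on.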
16142 1727204122.42788: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 16142 1727204122.42923: in run() - task 0affcd87-79f5-fddd-f6c7-0000000005e8 16142 1727204122.42947: variable 'ansible_search_path' from source: unknown 16142 1727204122.42957: variable 'ansible_search_path' from source: unknown 16142 1727204122.43006: calling self._execute() 16142 1727204122.43116: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204122.43133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204122.43150: variable 'omit' from source: magic vars 16142 1727204122.43539: variable 'ansible_distribution_major_version' from source: facts 16142 1727204122.43557: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204122.43698: variable 'nm_profile_exists' from source: set_fact 16142 1727204122.43721: Evaluated conditional (nm_profile_exists.rc == 0): True 16142 1727204122.43732: variable 'omit' from source: magic vars 16142 1727204122.43788: variable 'omit' from source: magic vars 16142 1727204122.43825: variable 'omit' from source: magic vars 16142 1727204122.43872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204122.43917: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204122.43946: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204122.43973: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204122.43990: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204122.44029: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204122.44038: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204122.44045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204122.44154: Set connection var ansible_timeout to 10 16142 1727204122.44163: Set connection var ansible_connection to ssh 16142 1727204122.44177: Set connection var ansible_shell_type to sh 16142 1727204122.44186: Set connection var ansible_shell_executable to /bin/sh 16142 1727204122.44195: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204122.44205: Set connection var ansible_pipelining to False 16142 1727204122.44235: variable 'ansible_shell_executable' from source: unknown 16142 1727204122.44242: variable 'ansible_connection' from source: unknown 16142 1727204122.44249: variable 'ansible_module_compression' from source: unknown 16142 1727204122.44255: variable 'ansible_shell_type' from source: unknown 16142 1727204122.44261: variable 'ansible_shell_executable' from source: unknown 16142 1727204122.44270: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204122.44278: variable 'ansible_pipelining' from source: unknown 16142 1727204122.44283: variable 'ansible_timeout' from source: unknown 16142 1727204122.44291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204122.44444: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204122.44461: variable 'omit' from source: magic vars 16142 1727204122.44479: starting attempt loop 16142 1727204122.44487: running the handler 16142 1727204122.44511: handler run complete 16142 1727204122.44519: attempt loop complete, returning result 16142 1727204122.44522: _execute() done 16142 1727204122.44525: dumping result to json 16142 1727204122.44527: done dumping result, returning 16142 1727204122.44537: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-fddd-f6c7-0000000005e8] 16142 1727204122.44545: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000005e8 16142 1727204122.44629: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000005e8 16142 1727204122.44631: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 16142 1727204122.44704: no more pending results, returning what we have 16142 1727204122.44708: results queue empty 16142 1727204122.44709: checking for any_errors_fatal 16142 1727204122.44717: done checking for any_errors_fatal 16142 1727204122.44718: checking for max_fail_percentage 16142 1727204122.44720: done checking for max_fail_percentage 16142 1727204122.44720: checking to see if all hosts have failed and the running result is not ok 16142 1727204122.44721: done checking to see if all hosts have failed 16142 1727204122.44722: getting the remaining hosts for this loop 16142 1727204122.44723: done getting the remaining hosts for this loop 16142 1727204122.44727: getting the next task for host managed-node2 16142 1727204122.44740: done getting next task for host managed-node2 16142 1727204122.44743: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 16142 1727204122.44746: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204122.44751: getting variables 16142 1727204122.44753: in VariableManager get_vars() 16142 1727204122.44810: Calling all_inventory to load vars for managed-node2 16142 1727204122.44813: Calling groups_inventory to load vars for managed-node2 16142 1727204122.44815: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204122.44827: Calling all_plugins_play to load vars for managed-node2 16142 1727204122.44830: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204122.44833: Calling groups_plugins_play to load vars for managed-node2 16142 1727204122.45796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204122.46901: done with get_vars() 16142 1727204122.46923: done getting variables 16142 1727204122.46972: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204122.47060: variable 'profile' from source: include params 16142 1727204122.47066: variable 'item' from source: include params 16142 1727204122.47108: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:55:22 -0400 (0:00:00.050) 0:00:21.648 ***** 16142 1727204122.47136: entering _queue_task() for managed-node2/command 16142 1727204122.47444: worker is 1 (out of 1 available) 16142 1727204122.47458: exiting _queue_task() for managed-node2/command 16142 1727204122.47470: done queuing things up, now waiting for results queue to drain 16142 1727204122.47472: waiting for pending results... 
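The set_fact traced above reports its gate as (nm_profile_exists.rc == 0) and returns all three lsr_net_profile_* facts as true. A plausible reconstruction of that task, inferred only from this output (the YAML source is not part of the log):

    # Sketch inferred from the log: runs only when the earlier nmcli lookup,
    # presumably registered as nm_profile_exists, exited 0, and flips all three flags to true.
    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      set_fact:
        lsr_net_profile_exists: true
        lsr_net_profile_ansible_managed: true
        lsr_net_profile_fingerprint: true
      when: nm_profile_exists.rc == 0
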
16142 1727204122.47740: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-bond0 16142 1727204122.47855: in run() - task 0affcd87-79f5-fddd-f6c7-0000000005ea 16142 1727204122.47876: variable 'ansible_search_path' from source: unknown 16142 1727204122.47883: variable 'ansible_search_path' from source: unknown 16142 1727204122.47923: calling self._execute() 16142 1727204122.48024: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204122.48039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204122.48053: variable 'omit' from source: magic vars 16142 1727204122.48408: variable 'ansible_distribution_major_version' from source: facts 16142 1727204122.48427: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204122.48564: variable 'profile_stat' from source: set_fact 16142 1727204122.48586: Evaluated conditional (profile_stat.stat.exists): False 16142 1727204122.48595: when evaluation is False, skipping this task 16142 1727204122.48603: _execute() done 16142 1727204122.48610: dumping result to json 16142 1727204122.48617: done dumping result, returning 16142 1727204122.48629: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-bond0 [0affcd87-79f5-fddd-f6c7-0000000005ea] 16142 1727204122.48641: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000005ea 16142 1727204122.48753: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000005ea 16142 1727204122.48760: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16142 1727204122.48830: no more pending results, returning what we have 16142 1727204122.48834: results queue empty 16142 1727204122.48835: checking for any_errors_fatal 16142 1727204122.48843: done checking for any_errors_fatal 16142 1727204122.48844: checking for max_fail_percentage 16142 1727204122.48846: done checking for max_fail_percentage 16142 1727204122.48846: checking to see if all hosts have failed and the running result is not ok 16142 1727204122.48847: done checking to see if all hosts have failed 16142 1727204122.48848: getting the remaining hosts for this loop 16142 1727204122.48849: done getting the remaining hosts for this loop 16142 1727204122.48854: getting the next task for host managed-node2 16142 1727204122.48861: done getting next task for host managed-node2 16142 1727204122.48865: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 16142 1727204122.48869: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204122.48874: getting variables 16142 1727204122.48876: in VariableManager get_vars() 16142 1727204122.48937: Calling all_inventory to load vars for managed-node2 16142 1727204122.48940: Calling groups_inventory to load vars for managed-node2 16142 1727204122.48943: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204122.48955: Calling all_plugins_play to load vars for managed-node2 16142 1727204122.48958: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204122.48961: Calling groups_plugins_play to load vars for managed-node2 16142 1727204122.50158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204122.51071: done with get_vars() 16142 1727204122.51089: done getting variables 16142 1727204122.51137: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204122.51223: variable 'profile' from source: include params 16142 1727204122.51226: variable 'item' from source: include params 16142 1727204122.51271: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:55:22 -0400 (0:00:00.041) 0:00:21.689 ***** 16142 1727204122.51295: entering _queue_task() for managed-node2/set_fact 16142 1727204122.51642: worker is 1 (out of 1 available) 16142 1727204122.51653: exiting _queue_task() for managed-node2/set_fact 16142 1727204122.51666: done queuing things up, now waiting for results queue to drain 16142 1727204122.51667: waiting for pending results... 
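The task skipped above (get_profile_stat.yml:49) is a command action gated on profile_stat.stat.exists, which evaluated to False, so nothing was executed. Its approximate shape is sketched below; the grep pattern, ifcfg path, register name, and error handling are illustrative assumptions, since a skipped task never echoes its command:

    # Approximate shape only; pattern, path, register name, and ignore_errors are assumed.
    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      command: grep 'ansible_managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
      register: lsr_net_profile_ansible_managed_output
      ignore_errors: true
      when: profile_stat.stat.exists
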
16142 1727204122.51940: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-bond0 16142 1727204122.52061: in run() - task 0affcd87-79f5-fddd-f6c7-0000000005eb 16142 1727204122.52082: variable 'ansible_search_path' from source: unknown 16142 1727204122.52089: variable 'ansible_search_path' from source: unknown 16142 1727204122.52139: calling self._execute() 16142 1727204122.52373: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204122.52377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204122.52380: variable 'omit' from source: magic vars 16142 1727204122.52682: variable 'ansible_distribution_major_version' from source: facts 16142 1727204122.52695: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204122.52826: variable 'profile_stat' from source: set_fact 16142 1727204122.52842: Evaluated conditional (profile_stat.stat.exists): False 16142 1727204122.52846: when evaluation is False, skipping this task 16142 1727204122.52849: _execute() done 16142 1727204122.52852: dumping result to json 16142 1727204122.52854: done dumping result, returning 16142 1727204122.52861: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-bond0 [0affcd87-79f5-fddd-f6c7-0000000005eb] 16142 1727204122.52868: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000005eb 16142 1727204122.52959: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000005eb 16142 1727204122.52962: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16142 1727204122.53018: no more pending results, returning what we have 16142 1727204122.53022: results queue empty 16142 1727204122.53023: checking for any_errors_fatal 16142 1727204122.53029: done checking for any_errors_fatal 16142 1727204122.53030: checking for max_fail_percentage 16142 1727204122.53032: done checking for max_fail_percentage 16142 1727204122.53033: checking to see if all hosts have failed and the running result is not ok 16142 1727204122.53033: done checking to see if all hosts have failed 16142 1727204122.53034: getting the remaining hosts for this loop 16142 1727204122.53036: done getting the remaining hosts for this loop 16142 1727204122.53039: getting the next task for host managed-node2 16142 1727204122.53046: done getting next task for host managed-node2 16142 1727204122.53048: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 16142 1727204122.53052: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204122.53058: getting variables 16142 1727204122.53059: in VariableManager get_vars() 16142 1727204122.53113: Calling all_inventory to load vars for managed-node2 16142 1727204122.53116: Calling groups_inventory to load vars for managed-node2 16142 1727204122.53118: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204122.53127: Calling all_plugins_play to load vars for managed-node2 16142 1727204122.53129: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204122.53132: Calling groups_plugins_play to load vars for managed-node2 16142 1727204122.54665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204122.56278: done with get_vars() 16142 1727204122.56304: done getting variables 16142 1727204122.56360: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204122.56477: variable 'profile' from source: include params 16142 1727204122.56481: variable 'item' from source: include params 16142 1727204122.56540: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:55:22 -0400 (0:00:00.052) 0:00:21.742 ***** 16142 1727204122.56574: entering _queue_task() for managed-node2/command 16142 1727204122.56904: worker is 1 (out of 1 available) 16142 1727204122.56917: exiting _queue_task() for managed-node2/command 16142 1727204122.56928: done queuing things up, now waiting for results queue to drain 16142 1727204122.56929: waiting for pending results... 
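The verify step skipped here (get_profile_stat.yml:56) is a set_fact behind the same profile_stat.stat.exists gate. Presumably it records whether the preceding grep found the comment; the expression and the register it reads are assumptions:

    # Assumed body; the real task may test the grep output differently.
    - name: Verify the ansible_managed comment in ifcfg-{{ profile }}
      set_fact:
        lsr_net_profile_ansible_managed: "{{ lsr_net_profile_ansible_managed_output.rc == 0 }}"
      when: profile_stat.stat.exists
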
16142 1727204122.57206: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-bond0 16142 1727204122.57321: in run() - task 0affcd87-79f5-fddd-f6c7-0000000005ec 16142 1727204122.57338: variable 'ansible_search_path' from source: unknown 16142 1727204122.57345: variable 'ansible_search_path' from source: unknown 16142 1727204122.57387: calling self._execute() 16142 1727204122.57482: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204122.57492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204122.57503: variable 'omit' from source: magic vars 16142 1727204122.57845: variable 'ansible_distribution_major_version' from source: facts 16142 1727204122.57867: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204122.57992: variable 'profile_stat' from source: set_fact 16142 1727204122.58012: Evaluated conditional (profile_stat.stat.exists): False 16142 1727204122.58024: when evaluation is False, skipping this task 16142 1727204122.58032: _execute() done 16142 1727204122.58040: dumping result to json 16142 1727204122.58048: done dumping result, returning 16142 1727204122.58057: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-bond0 [0affcd87-79f5-fddd-f6c7-0000000005ec] 16142 1727204122.58071: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000005ec skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16142 1727204122.58209: no more pending results, returning what we have 16142 1727204122.58213: results queue empty 16142 1727204122.58214: checking for any_errors_fatal 16142 1727204122.58223: done checking for any_errors_fatal 16142 1727204122.58224: checking for max_fail_percentage 16142 1727204122.58226: done checking for max_fail_percentage 16142 1727204122.58227: checking to see if all hosts have failed and the running result is not ok 16142 1727204122.58228: done checking to see if all hosts have failed 16142 1727204122.58228: getting the remaining hosts for this loop 16142 1727204122.58230: done getting the remaining hosts for this loop 16142 1727204122.58233: getting the next task for host managed-node2 16142 1727204122.58241: done getting next task for host managed-node2 16142 1727204122.58244: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 16142 1727204122.58248: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204122.58252: getting variables 16142 1727204122.58254: in VariableManager get_vars() 16142 1727204122.58311: Calling all_inventory to load vars for managed-node2 16142 1727204122.58313: Calling groups_inventory to load vars for managed-node2 16142 1727204122.58315: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204122.58327: Calling all_plugins_play to load vars for managed-node2 16142 1727204122.58330: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204122.58332: Calling groups_plugins_play to load vars for managed-node2 16142 1727204122.59499: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000005ec 16142 1727204122.59503: WORKER PROCESS EXITING 16142 1727204122.60033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204122.61662: done with get_vars() 16142 1727204122.61693: done getting variables 16142 1727204122.61751: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204122.61870: variable 'profile' from source: include params 16142 1727204122.61874: variable 'item' from source: include params 16142 1727204122.61933: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:55:22 -0400 (0:00:00.053) 0:00:21.796 ***** 16142 1727204122.61967: entering _queue_task() for managed-node2/set_fact 16142 1727204122.62284: worker is 1 (out of 1 available) 16142 1727204122.62296: exiting _queue_task() for managed-node2/set_fact 16142 1727204122.62308: done queuing things up, now waiting for results queue to drain 16142 1727204122.62309: waiting for pending results... 
16142 1727204122.62587: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-bond0 16142 1727204122.62716: in run() - task 0affcd87-79f5-fddd-f6c7-0000000005ed 16142 1727204122.62735: variable 'ansible_search_path' from source: unknown 16142 1727204122.62742: variable 'ansible_search_path' from source: unknown 16142 1727204122.62788: calling self._execute() 16142 1727204122.62890: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204122.62901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204122.62915: variable 'omit' from source: magic vars 16142 1727204122.63274: variable 'ansible_distribution_major_version' from source: facts 16142 1727204122.63293: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204122.63423: variable 'profile_stat' from source: set_fact 16142 1727204122.63442: Evaluated conditional (profile_stat.stat.exists): False 16142 1727204122.63449: when evaluation is False, skipping this task 16142 1727204122.63455: _execute() done 16142 1727204122.63463: dumping result to json 16142 1727204122.63472: done dumping result, returning 16142 1727204122.63482: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-bond0 [0affcd87-79f5-fddd-f6c7-0000000005ed] 16142 1727204122.63491: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000005ed 16142 1727204122.63601: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000005ed 16142 1727204122.63608: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16142 1727204122.63674: no more pending results, returning what we have 16142 1727204122.63679: results queue empty 16142 1727204122.63680: checking for any_errors_fatal 16142 1727204122.63686: done checking for any_errors_fatal 16142 1727204122.63687: checking for max_fail_percentage 16142 1727204122.63689: done checking for max_fail_percentage 16142 1727204122.63690: checking to see if all hosts have failed and the running result is not ok 16142 1727204122.63691: done checking to see if all hosts have failed 16142 1727204122.63691: getting the remaining hosts for this loop 16142 1727204122.63693: done getting the remaining hosts for this loop 16142 1727204122.63697: getting the next task for host managed-node2 16142 1727204122.63706: done getting next task for host managed-node2 16142 1727204122.63709: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 16142 1727204122.63712: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204122.63718: getting variables 16142 1727204122.63720: in VariableManager get_vars() 16142 1727204122.63785: Calling all_inventory to load vars for managed-node2 16142 1727204122.63789: Calling groups_inventory to load vars for managed-node2 16142 1727204122.63791: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204122.63806: Calling all_plugins_play to load vars for managed-node2 16142 1727204122.63809: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204122.63813: Calling groups_plugins_play to load vars for managed-node2 16142 1727204122.65668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204122.67309: done with get_vars() 16142 1727204122.67339: done getting variables 16142 1727204122.67402: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204122.67529: variable 'profile' from source: include params 16142 1727204122.67533: variable 'item' from source: include params 16142 1727204122.67597: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:55:22 -0400 (0:00:00.056) 0:00:21.853 ***** 16142 1727204122.67627: entering _queue_task() for managed-node2/assert 16142 1727204122.67948: worker is 1 (out of 1 available) 16142 1727204122.67967: exiting _queue_task() for managed-node2/assert 16142 1727204122.67979: done queuing things up, now waiting for results queue to drain 16142 1727204122.67981: waiting for pending results... 
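The fingerprint pair skipped above (get_profile_stat.yml:62 and :69) mirrors the ansible_managed pair: a command lookup followed by a set_fact, both gated on profile_stat.stat.exists. A condensed sketch, with the grep target, path, and register name assumed:

    # Assumed mirror of the ansible_managed pair, for the role fingerprint comment.
    - name: Get the fingerprint comment in ifcfg-{{ profile }}
      command: grep 'system_role:network' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
      register: lsr_net_profile_fingerprint_output
      ignore_errors: true
      when: profile_stat.stat.exists

    - name: Verify the fingerprint comment in ifcfg-{{ profile }}
      set_fact:
        lsr_net_profile_fingerprint: "{{ lsr_net_profile_fingerprint_output.rc == 0 }}"
      when: profile_stat.stat.exists
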
16142 1727204122.68299: running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'bond0' 16142 1727204122.68431: in run() - task 0affcd87-79f5-fddd-f6c7-000000000356 16142 1727204122.68452: variable 'ansible_search_path' from source: unknown 16142 1727204122.68460: variable 'ansible_search_path' from source: unknown 16142 1727204122.68510: calling self._execute() 16142 1727204122.68624: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204122.68641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204122.68656: variable 'omit' from source: magic vars 16142 1727204122.69049: variable 'ansible_distribution_major_version' from source: facts 16142 1727204122.69072: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204122.69085: variable 'omit' from source: magic vars 16142 1727204122.69126: variable 'omit' from source: magic vars 16142 1727204122.69228: variable 'profile' from source: include params 16142 1727204122.69238: variable 'item' from source: include params 16142 1727204122.69305: variable 'item' from source: include params 16142 1727204122.69328: variable 'omit' from source: magic vars 16142 1727204122.69374: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204122.69417: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204122.69443: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204122.69466: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204122.69482: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204122.69520: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204122.69528: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204122.69535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204122.69639: Set connection var ansible_timeout to 10 16142 1727204122.69647: Set connection var ansible_connection to ssh 16142 1727204122.69656: Set connection var ansible_shell_type to sh 16142 1727204122.69667: Set connection var ansible_shell_executable to /bin/sh 16142 1727204122.69677: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204122.69689: Set connection var ansible_pipelining to False 16142 1727204122.69714: variable 'ansible_shell_executable' from source: unknown 16142 1727204122.69727: variable 'ansible_connection' from source: unknown 16142 1727204122.69734: variable 'ansible_module_compression' from source: unknown 16142 1727204122.69741: variable 'ansible_shell_type' from source: unknown 16142 1727204122.69746: variable 'ansible_shell_executable' from source: unknown 16142 1727204122.69752: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204122.69757: variable 'ansible_pipelining' from source: unknown 16142 1727204122.69763: variable 'ansible_timeout' from source: unknown 16142 1727204122.69772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204122.69918: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204122.69937: variable 'omit' from source: magic vars 16142 1727204122.69953: starting attempt loop 16142 1727204122.69960: running the handler 16142 1727204122.70078: variable 'lsr_net_profile_exists' from source: set_fact 16142 1727204122.70089: Evaluated conditional (lsr_net_profile_exists): True 16142 1727204122.70099: handler run complete 16142 1727204122.70117: attempt loop complete, returning result 16142 1727204122.70123: _execute() done 16142 1727204122.70128: dumping result to json 16142 1727204122.70135: done dumping result, returning 16142 1727204122.70145: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'bond0' [0affcd87-79f5-fddd-f6c7-000000000356] 16142 1727204122.70153: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000356 16142 1727204122.70254: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000356 16142 1727204122.70262: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 16142 1727204122.70316: no more pending results, returning what we have 16142 1727204122.70319: results queue empty 16142 1727204122.70320: checking for any_errors_fatal 16142 1727204122.70326: done checking for any_errors_fatal 16142 1727204122.70327: checking for max_fail_percentage 16142 1727204122.70329: done checking for max_fail_percentage 16142 1727204122.70330: checking to see if all hosts have failed and the running result is not ok 16142 1727204122.70331: done checking to see if all hosts have failed 16142 1727204122.70332: getting the remaining hosts for this loop 16142 1727204122.70333: done getting the remaining hosts for this loop 16142 1727204122.70336: getting the next task for host managed-node2 16142 1727204122.70344: done getting next task for host managed-node2 16142 1727204122.70347: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 16142 1727204122.70349: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204122.70353: getting variables 16142 1727204122.70355: in VariableManager get_vars() 16142 1727204122.70417: Calling all_inventory to load vars for managed-node2 16142 1727204122.70420: Calling groups_inventory to load vars for managed-node2 16142 1727204122.70422: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204122.70433: Calling all_plugins_play to load vars for managed-node2 16142 1727204122.70436: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204122.70438: Calling groups_plugins_play to load vars for managed-node2 16142 1727204122.72153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204122.73985: done with get_vars() 16142 1727204122.74011: done getting variables 16142 1727204122.74078: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204122.74202: variable 'profile' from source: include params 16142 1727204122.74206: variable 'item' from source: include params 16142 1727204122.74271: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:55:22 -0400 (0:00:00.066) 0:00:21.919 ***** 16142 1727204122.74308: entering _queue_task() for managed-node2/assert 16142 1727204122.74631: worker is 1 (out of 1 available) 16142 1727204122.74654: exiting _queue_task() for managed-node2/assert 16142 1727204122.74668: done queuing things up, now waiting for results queue to drain 16142 1727204122.74669: waiting for pending results... 
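The assert that just passed (assert_profile_present.yml:5) checks the exists flag set by the earlier set_fact; its evaluated conditional in the log is simply lsr_net_profile_exists. A sketch consistent with that trace:

    # Passes here because lsr_net_profile_exists was set to true earlier in this run.
    - name: Assert that the profile is present - '{{ profile }}'
      assert:
        that:
          - lsr_net_profile_exists
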
16142 1727204122.74968: running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'bond0' 16142 1727204122.75090: in run() - task 0affcd87-79f5-fddd-f6c7-000000000357 16142 1727204122.75122: variable 'ansible_search_path' from source: unknown 16142 1727204122.75132: variable 'ansible_search_path' from source: unknown 16142 1727204122.75183: calling self._execute() 16142 1727204122.75406: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204122.75419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204122.75551: variable 'omit' from source: magic vars 16142 1727204122.76276: variable 'ansible_distribution_major_version' from source: facts 16142 1727204122.76453: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204122.76471: variable 'omit' from source: magic vars 16142 1727204122.76518: variable 'omit' from source: magic vars 16142 1727204122.76633: variable 'profile' from source: include params 16142 1727204122.76763: variable 'item' from source: include params 16142 1727204122.76831: variable 'item' from source: include params 16142 1727204122.76987: variable 'omit' from source: magic vars 16142 1727204122.77032: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204122.77085: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204122.77117: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204122.77141: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204122.77158: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204122.77208: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204122.77218: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204122.77226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204122.77352: Set connection var ansible_timeout to 10 16142 1727204122.77361: Set connection var ansible_connection to ssh 16142 1727204122.77391: Set connection var ansible_shell_type to sh 16142 1727204122.77407: Set connection var ansible_shell_executable to /bin/sh 16142 1727204122.77520: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204122.77540: Set connection var ansible_pipelining to False 16142 1727204122.77573: variable 'ansible_shell_executable' from source: unknown 16142 1727204122.77582: variable 'ansible_connection' from source: unknown 16142 1727204122.77588: variable 'ansible_module_compression' from source: unknown 16142 1727204122.77594: variable 'ansible_shell_type' from source: unknown 16142 1727204122.77600: variable 'ansible_shell_executable' from source: unknown 16142 1727204122.77606: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204122.77614: variable 'ansible_pipelining' from source: unknown 16142 1727204122.77631: variable 'ansible_timeout' from source: unknown 16142 1727204122.77651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204122.77797: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204122.77813: variable 'omit' from source: magic vars 16142 1727204122.77822: starting attempt loop 16142 1727204122.77834: running the handler 16142 1727204122.77966: variable 'lsr_net_profile_ansible_managed' from source: set_fact 16142 1727204122.77979: Evaluated conditional (lsr_net_profile_ansible_managed): True 16142 1727204122.77988: handler run complete 16142 1727204122.78006: attempt loop complete, returning result 16142 1727204122.78013: _execute() done 16142 1727204122.78019: dumping result to json 16142 1727204122.78025: done dumping result, returning 16142 1727204122.78045: done running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'bond0' [0affcd87-79f5-fddd-f6c7-000000000357] 16142 1727204122.78078: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000357 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 16142 1727204122.78243: no more pending results, returning what we have 16142 1727204122.78247: results queue empty 16142 1727204122.78248: checking for any_errors_fatal 16142 1727204122.78257: done checking for any_errors_fatal 16142 1727204122.78258: checking for max_fail_percentage 16142 1727204122.78260: done checking for max_fail_percentage 16142 1727204122.78260: checking to see if all hosts have failed and the running result is not ok 16142 1727204122.78261: done checking to see if all hosts have failed 16142 1727204122.78262: getting the remaining hosts for this loop 16142 1727204122.78265: done getting the remaining hosts for this loop 16142 1727204122.78270: getting the next task for host managed-node2 16142 1727204122.78278: done getting next task for host managed-node2 16142 1727204122.78281: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 16142 1727204122.78285: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204122.78288: getting variables 16142 1727204122.78291: in VariableManager get_vars() 16142 1727204122.78365: Calling all_inventory to load vars for managed-node2 16142 1727204122.78370: Calling groups_inventory to load vars for managed-node2 16142 1727204122.78373: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204122.78387: Calling all_plugins_play to load vars for managed-node2 16142 1727204122.78391: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204122.78394: Calling groups_plugins_play to load vars for managed-node2 16142 1727204122.79532: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000357 16142 1727204122.79539: WORKER PROCESS EXITING 16142 1727204122.80635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204122.82944: done with get_vars() 16142 1727204122.82984: done getting variables 16142 1727204122.83052: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204122.83175: variable 'profile' from source: include params 16142 1727204122.83179: variable 'item' from source: include params 16142 1727204122.83244: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:55:22 -0400 (0:00:00.089) 0:00:22.009 ***** 16142 1727204122.83284: entering _queue_task() for managed-node2/assert 16142 1727204122.83614: worker is 1 (out of 1 available) 16142 1727204122.83627: exiting _queue_task() for managed-node2/assert 16142 1727204122.83640: done queuing things up, now waiting for results queue to drain 16142 1727204122.83641: waiting for pending results... 
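The remaining two asserts (assert_profile_present.yml:10 and :15), one just completed and one queued above, share the same shape and differ only in the flag they test, per the evaluated conditionals in the log:

    # Both flags were set true by the nmcli-based set_fact earlier in this run.
    - name: Assert that the ansible managed comment is present in '{{ profile }}'
      assert:
        that:
          - lsr_net_profile_ansible_managed

    - name: Assert that the fingerprint comment is present in {{ profile }}
      assert:
        that:
          - lsr_net_profile_fingerprint
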
16142 1727204122.83935: running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in bond0 16142 1727204122.84052: in run() - task 0affcd87-79f5-fddd-f6c7-000000000358 16142 1727204122.84076: variable 'ansible_search_path' from source: unknown 16142 1727204122.84086: variable 'ansible_search_path' from source: unknown 16142 1727204122.84127: calling self._execute() 16142 1727204122.84449: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204122.84461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204122.84477: variable 'omit' from source: magic vars 16142 1727204122.84877: variable 'ansible_distribution_major_version' from source: facts 16142 1727204122.84896: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204122.84908: variable 'omit' from source: magic vars 16142 1727204122.84959: variable 'omit' from source: magic vars 16142 1727204122.85083: variable 'profile' from source: include params 16142 1727204122.85092: variable 'item' from source: include params 16142 1727204122.85175: variable 'item' from source: include params 16142 1727204122.85197: variable 'omit' from source: magic vars 16142 1727204122.85245: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204122.85288: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204122.85315: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204122.85338: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204122.85359: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204122.85397: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204122.85407: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204122.85414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204122.85531: Set connection var ansible_timeout to 10 16142 1727204122.85541: Set connection var ansible_connection to ssh 16142 1727204122.85551: Set connection var ansible_shell_type to sh 16142 1727204122.85560: Set connection var ansible_shell_executable to /bin/sh 16142 1727204122.85571: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204122.85582: Set connection var ansible_pipelining to False 16142 1727204122.85614: variable 'ansible_shell_executable' from source: unknown 16142 1727204122.85623: variable 'ansible_connection' from source: unknown 16142 1727204122.85630: variable 'ansible_module_compression' from source: unknown 16142 1727204122.85640: variable 'ansible_shell_type' from source: unknown 16142 1727204122.85647: variable 'ansible_shell_executable' from source: unknown 16142 1727204122.85654: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204122.85661: variable 'ansible_pipelining' from source: unknown 16142 1727204122.85670: variable 'ansible_timeout' from source: unknown 16142 1727204122.85677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204122.85880: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204122.85895: variable 'omit' from source: magic vars 16142 1727204122.85904: starting attempt loop 16142 1727204122.85911: running the handler 16142 1727204122.86092: variable 'lsr_net_profile_fingerprint' from source: set_fact 16142 1727204122.86102: Evaluated conditional (lsr_net_profile_fingerprint): True 16142 1727204122.86111: handler run complete 16142 1727204122.86130: attempt loop complete, returning result 16142 1727204122.86144: _execute() done 16142 1727204122.86150: dumping result to json 16142 1727204122.86157: done dumping result, returning 16142 1727204122.86169: done running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in bond0 [0affcd87-79f5-fddd-f6c7-000000000358] 16142 1727204122.86180: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000358 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 16142 1727204122.86327: no more pending results, returning what we have 16142 1727204122.86331: results queue empty 16142 1727204122.86335: checking for any_errors_fatal 16142 1727204122.86342: done checking for any_errors_fatal 16142 1727204122.86343: checking for max_fail_percentage 16142 1727204122.86345: done checking for max_fail_percentage 16142 1727204122.86346: checking to see if all hosts have failed and the running result is not ok 16142 1727204122.86347: done checking to see if all hosts have failed 16142 1727204122.86348: getting the remaining hosts for this loop 16142 1727204122.86349: done getting the remaining hosts for this loop 16142 1727204122.86353: getting the next task for host managed-node2 16142 1727204122.86367: done getting next task for host managed-node2 16142 1727204122.86371: ^ task is: TASK: Include the task 'get_profile_stat.yml' 16142 1727204122.86374: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204122.86379: getting variables 16142 1727204122.86381: in VariableManager get_vars() 16142 1727204122.86447: Calling all_inventory to load vars for managed-node2 16142 1727204122.86450: Calling groups_inventory to load vars for managed-node2 16142 1727204122.86453: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204122.86466: Calling all_plugins_play to load vars for managed-node2 16142 1727204122.86469: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204122.86472: Calling groups_plugins_play to load vars for managed-node2 16142 1727204122.87671: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000358 16142 1727204122.87676: WORKER PROCESS EXITING 16142 1727204122.88556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204122.90977: done with get_vars() 16142 1727204122.91014: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:55:22 -0400 (0:00:00.080) 0:00:22.090 ***** 16142 1727204122.91337: entering _queue_task() for managed-node2/include_tasks 16142 1727204122.92042: worker is 1 (out of 1 available) 16142 1727204122.92057: exiting _queue_task() for managed-node2/include_tasks 16142 1727204122.92073: done queuing things up, now waiting for results queue to drain 16142 1727204122.92074: waiting for pending results... 16142 1727204122.93298: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 16142 1727204122.93524: in run() - task 0affcd87-79f5-fddd-f6c7-00000000035c 16142 1727204122.93608: variable 'ansible_search_path' from source: unknown 16142 1727204122.93619: variable 'ansible_search_path' from source: unknown 16142 1727204122.93687: calling self._execute() 16142 1727204122.93991: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204122.94153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204122.94171: variable 'omit' from source: magic vars 16142 1727204122.94849: variable 'ansible_distribution_major_version' from source: facts 16142 1727204122.94871: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204122.94882: _execute() done 16142 1727204122.94891: dumping result to json 16142 1727204122.94901: done dumping result, returning 16142 1727204122.94913: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-fddd-f6c7-00000000035c] 16142 1727204122.94925: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000035c 16142 1727204122.95066: no more pending results, returning what we have 16142 1727204122.95072: in VariableManager get_vars() 16142 1727204122.95143: Calling all_inventory to load vars for managed-node2 16142 1727204122.95146: Calling groups_inventory to load vars for managed-node2 16142 1727204122.95148: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204122.95162: Calling all_plugins_play to load vars for managed-node2 16142 1727204122.95168: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204122.95172: Calling groups_plugins_play to load vars for managed-node2 16142 1727204122.96229: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000035c 16142 
1727204122.96236: WORKER PROCESS EXITING 16142 1727204122.97071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204122.99075: done with get_vars() 16142 1727204122.99103: variable 'ansible_search_path' from source: unknown 16142 1727204122.99105: variable 'ansible_search_path' from source: unknown 16142 1727204122.99147: we have included files to process 16142 1727204122.99149: generating all_blocks data 16142 1727204122.99151: done generating all_blocks data 16142 1727204122.99156: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 16142 1727204122.99158: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 16142 1727204122.99160: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 16142 1727204123.00289: done processing included file 16142 1727204123.00292: iterating over new_blocks loaded from include file 16142 1727204123.00293: in VariableManager get_vars() 16142 1727204123.00325: done with get_vars() 16142 1727204123.00327: filtering new block on tags 16142 1727204123.00352: done filtering new block on tags 16142 1727204123.00354: in VariableManager get_vars() 16142 1727204123.00381: done with get_vars() 16142 1727204123.00383: filtering new block on tags 16142 1727204123.00400: done filtering new block on tags 16142 1727204123.00402: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 16142 1727204123.00407: extending task lists for all hosts with included blocks 16142 1727204123.00670: done extending task lists 16142 1727204123.00672: done processing included files 16142 1727204123.00672: results queue empty 16142 1727204123.00673: checking for any_errors_fatal 16142 1727204123.00677: done checking for any_errors_fatal 16142 1727204123.00678: checking for max_fail_percentage 16142 1727204123.00679: done checking for max_fail_percentage 16142 1727204123.00680: checking to see if all hosts have failed and the running result is not ok 16142 1727204123.00681: done checking to see if all hosts have failed 16142 1727204123.00682: getting the remaining hosts for this loop 16142 1727204123.00683: done getting the remaining hosts for this loop 16142 1727204123.00685: getting the next task for host managed-node2 16142 1727204123.00690: done getting next task for host managed-node2 16142 1727204123.00692: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 16142 1727204123.00695: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204123.00697: getting variables 16142 1727204123.00698: in VariableManager get_vars() 16142 1727204123.00716: Calling all_inventory to load vars for managed-node2 16142 1727204123.00718: Calling groups_inventory to load vars for managed-node2 16142 1727204123.00720: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204123.00726: Calling all_plugins_play to load vars for managed-node2 16142 1727204123.00729: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204123.00731: Calling groups_plugins_play to load vars for managed-node2 16142 1727204123.03309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204123.06849: done with get_vars() 16142 1727204123.07140: done getting variables 16142 1727204123.07197: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:55:23 -0400 (0:00:00.158) 0:00:22.249 ***** 16142 1727204123.07234: entering _queue_task() for managed-node2/set_fact 16142 1727204123.07584: worker is 1 (out of 1 available) 16142 1727204123.07597: exiting _queue_task() for managed-node2/set_fact 16142 1727204123.07609: done queuing things up, now waiting for results queue to drain 16142 1727204123.07610: waiting for pending results... 
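At this point assert_profile_present.yml:3 has re-included get_profile_stat.yml for the next item, and its opening tasks run again. From the task names, paths, and results in this log, the top of that file apparently resets the flags and then stats the profile file; the ifcfg path below is an assumption:

    # Inferred opening of get_profile_stat.yml; the task names, the false defaults,
    # and the use of profile_stat.stat.exists later are taken from this log,
    # while the stat path is assumed for illustration.
    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false

    - name: Stat profile file
      stat:
        path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # path assumed
      register: profile_stat
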
16142 1727204123.08912: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 16142 1727204123.09148: in run() - task 0affcd87-79f5-fddd-f6c7-00000000062c 16142 1727204123.09160: variable 'ansible_search_path' from source: unknown 16142 1727204123.09166: variable 'ansible_search_path' from source: unknown 16142 1727204123.09322: calling self._execute() 16142 1727204123.09537: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204123.09546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204123.09554: variable 'omit' from source: magic vars 16142 1727204123.10785: variable 'ansible_distribution_major_version' from source: facts 16142 1727204123.10912: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204123.10919: variable 'omit' from source: magic vars 16142 1727204123.11042: variable 'omit' from source: magic vars 16142 1727204123.11129: variable 'omit' from source: magic vars 16142 1727204123.11177: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204123.11215: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204123.11379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204123.11400: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204123.11410: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204123.11444: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204123.11447: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204123.11450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204123.11791: Set connection var ansible_timeout to 10 16142 1727204123.11798: Set connection var ansible_connection to ssh 16142 1727204123.11807: Set connection var ansible_shell_type to sh 16142 1727204123.11816: Set connection var ansible_shell_executable to /bin/sh 16142 1727204123.11827: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204123.11838: Set connection var ansible_pipelining to False 16142 1727204123.11869: variable 'ansible_shell_executable' from source: unknown 16142 1727204123.11877: variable 'ansible_connection' from source: unknown 16142 1727204123.11883: variable 'ansible_module_compression' from source: unknown 16142 1727204123.11889: variable 'ansible_shell_type' from source: unknown 16142 1727204123.11894: variable 'ansible_shell_executable' from source: unknown 16142 1727204123.11899: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204123.11906: variable 'ansible_pipelining' from source: unknown 16142 1727204123.11912: variable 'ansible_timeout' from source: unknown 16142 1727204123.11920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204123.12073: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204123.12091: variable 
'omit' from source: magic vars 16142 1727204123.12101: starting attempt loop 16142 1727204123.12107: running the handler 16142 1727204123.12122: handler run complete 16142 1727204123.12138: attempt loop complete, returning result 16142 1727204123.12145: _execute() done 16142 1727204123.12155: dumping result to json 16142 1727204123.12162: done dumping result, returning 16142 1727204123.12174: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-fddd-f6c7-00000000062c] 16142 1727204123.12183: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000062c ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 16142 1727204123.12331: no more pending results, returning what we have 16142 1727204123.12335: results queue empty 16142 1727204123.12336: checking for any_errors_fatal 16142 1727204123.12338: done checking for any_errors_fatal 16142 1727204123.12339: checking for max_fail_percentage 16142 1727204123.12341: done checking for max_fail_percentage 16142 1727204123.12342: checking to see if all hosts have failed and the running result is not ok 16142 1727204123.12343: done checking to see if all hosts have failed 16142 1727204123.12343: getting the remaining hosts for this loop 16142 1727204123.12345: done getting the remaining hosts for this loop 16142 1727204123.12349: getting the next task for host managed-node2 16142 1727204123.12356: done getting next task for host managed-node2 16142 1727204123.12360: ^ task is: TASK: Stat profile file 16142 1727204123.12367: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204123.12373: getting variables 16142 1727204123.12375: in VariableManager get_vars() 16142 1727204123.12438: Calling all_inventory to load vars for managed-node2 16142 1727204123.12442: Calling groups_inventory to load vars for managed-node2 16142 1727204123.12444: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204123.12456: Calling all_plugins_play to load vars for managed-node2 16142 1727204123.12459: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204123.12462: Calling groups_plugins_play to load vars for managed-node2 16142 1727204123.13487: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000062c 16142 1727204123.13491: WORKER PROCESS EXITING 16142 1727204123.14990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204123.29859: done with get_vars() 16142 1727204123.29890: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:55:23 -0400 (0:00:00.227) 0:00:22.476 ***** 16142 1727204123.29977: entering _queue_task() for managed-node2/stat 16142 1727204123.31019: worker is 1 (out of 1 available) 16142 1727204123.31032: exiting _queue_task() for managed-node2/stat 16142 1727204123.31044: done queuing things up, now waiting for results queue to drain 16142 1727204123.31046: waiting for pending results... 16142 1727204123.31992: running TaskExecutor() for managed-node2/TASK: Stat profile file 16142 1727204123.32221: in run() - task 0affcd87-79f5-fddd-f6c7-00000000062d 16142 1727204123.32349: variable 'ansible_search_path' from source: unknown 16142 1727204123.32357: variable 'ansible_search_path' from source: unknown 16142 1727204123.32393: calling self._execute() 16142 1727204123.32606: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204123.32610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204123.32623: variable 'omit' from source: magic vars 16142 1727204123.33492: variable 'ansible_distribution_major_version' from source: facts 16142 1727204123.33505: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204123.33512: variable 'omit' from source: magic vars 16142 1727204123.33570: variable 'omit' from source: magic vars 16142 1727204123.33788: variable 'profile' from source: include params 16142 1727204123.33792: variable 'item' from source: include params 16142 1727204123.33862: variable 'item' from source: include params 16142 1727204123.34000: variable 'omit' from source: magic vars 16142 1727204123.34049: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204123.34084: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204123.34221: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204123.34241: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204123.34253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204123.34287: variable 'inventory_hostname' from source: host vars for 
'managed-node2' 16142 1727204123.34291: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204123.34294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204123.34521: Set connection var ansible_timeout to 10 16142 1727204123.34643: Set connection var ansible_connection to ssh 16142 1727204123.34650: Set connection var ansible_shell_type to sh 16142 1727204123.34656: Set connection var ansible_shell_executable to /bin/sh 16142 1727204123.34663: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204123.34673: Set connection var ansible_pipelining to False 16142 1727204123.34701: variable 'ansible_shell_executable' from source: unknown 16142 1727204123.34704: variable 'ansible_connection' from source: unknown 16142 1727204123.34708: variable 'ansible_module_compression' from source: unknown 16142 1727204123.34710: variable 'ansible_shell_type' from source: unknown 16142 1727204123.34712: variable 'ansible_shell_executable' from source: unknown 16142 1727204123.34714: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204123.34719: variable 'ansible_pipelining' from source: unknown 16142 1727204123.34722: variable 'ansible_timeout' from source: unknown 16142 1727204123.34724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204123.35284: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204123.35296: variable 'omit' from source: magic vars 16142 1727204123.35303: starting attempt loop 16142 1727204123.35307: running the handler 16142 1727204123.35319: _low_level_execute_command(): starting 16142 1727204123.35327: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204123.37209: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204123.37219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204123.37373: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204123.37377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204123.37395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204123.37401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204123.37488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204123.37495: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204123.37513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204123.37681: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204123.39236: stdout chunk (state=3): >>>/root <<< 16142 1727204123.39402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204123.39406: stderr chunk (state=3): >>><<< 16142 1727204123.39409: stdout chunk (state=3): >>><<< 16142 1727204123.39436: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204123.39452: _low_level_execute_command(): starting 16142 1727204123.39460: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204123.3943775-17972-183041680847673 `" && echo ansible-tmp-1727204123.3943775-17972-183041680847673="` echo /root/.ansible/tmp/ansible-tmp-1727204123.3943775-17972-183041680847673 `" ) && sleep 0' 16142 1727204123.41022: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204123.41028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204123.41086: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204123.41090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204123.41166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204123.41170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204123.41253: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204123.41387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204123.41492: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 16142 1727204123.43334: stdout chunk (state=3): >>>ansible-tmp-1727204123.3943775-17972-183041680847673=/root/.ansible/tmp/ansible-tmp-1727204123.3943775-17972-183041680847673 <<< 16142 1727204123.43473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204123.43542: stderr chunk (state=3): >>><<< 16142 1727204123.43546: stdout chunk (state=3): >>><<< 16142 1727204123.43772: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204123.3943775-17972-183041680847673=/root/.ansible/tmp/ansible-tmp-1727204123.3943775-17972-183041680847673 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204123.43776: variable 'ansible_module_compression' from source: unknown 16142 1727204123.43778: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 16142 1727204123.43781: variable 'ansible_facts' from source: unknown 16142 1727204123.43809: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204123.3943775-17972-183041680847673/AnsiballZ_stat.py 16142 1727204123.44504: Sending initial data 16142 1727204123.44508: Sent initial data (153 bytes) 16142 1727204123.48605: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204123.48620: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204123.48634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204123.48684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204123.48728: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204123.48780: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204123.48795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204123.48814: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204123.48825: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204123.48835: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204123.48847: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 
1727204123.48859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204123.48894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204123.48907: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204123.48917: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204123.48930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204123.49115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204123.49132: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204123.49147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204123.49326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204123.50980: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 16142 1727204123.51003: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204123.51042: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204123.51086: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpe4pw3fzp /root/.ansible/tmp/ansible-tmp-1727204123.3943775-17972-183041680847673/AnsiballZ_stat.py <<< 16142 1727204123.51121: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204123.52543: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204123.52672: stderr chunk (state=3): >>><<< 16142 1727204123.52675: stdout chunk (state=3): >>><<< 16142 1727204123.52678: done transferring module to remote 16142 1727204123.52762: _low_level_execute_command(): starting 16142 1727204123.52774: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204123.3943775-17972-183041680847673/ /root/.ansible/tmp/ansible-tmp-1727204123.3943775-17972-183041680847673/AnsiballZ_stat.py && sleep 0' 16142 1727204123.54618: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204123.54757: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204123.54777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204123.54795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204123.54844: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204123.54884: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204123.54899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
16142 1727204123.54916: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204123.54927: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204123.54976: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204123.54989: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204123.55002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204123.55017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204123.55027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204123.55038: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204123.55051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204123.55131: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204123.55303: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204123.55318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204123.55516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204123.57291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204123.57295: stdout chunk (state=3): >>><<< 16142 1727204123.57297: stderr chunk (state=3): >>><<< 16142 1727204123.57395: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204123.57399: _low_level_execute_command(): starting 16142 1727204123.57401: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204123.3943775-17972-183041680847673/AnsiballZ_stat.py && sleep 0' 16142 1727204123.58839: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204123.58961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204123.58979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204123.58997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204123.59042: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204123.59180: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204123.59195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204123.59213: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204123.59224: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204123.59234: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204123.59245: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204123.59257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204123.59276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204123.59287: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204123.59296: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204123.59309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204123.59389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204123.59519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204123.59539: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204123.59680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204123.72733: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 16142 1727204123.73734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
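The stat result returned above (module_args: path=/etc/sysconfig/network-scripts/ifcfg-bond0.0 with get_attributes, get_checksum and get_mime all false) corresponds to the "Stat profile file" task at get_profile_stat.yml:9. The path is built from the include parameters 'profile' and 'item' resolved earlier, and the next task's condition (profile_stat.stat.exists) implies the result is registered as profile_stat. A plausible sketch; the Jinja2 templating of the path is an assumption:

```yaml
# Plausible sketch of get_profile_stat.yml:9. In this run "profile" resolved to
# bond0.0, giving the literal path /etc/sysconfig/network-scripts/ifcfg-bond0.0.
- name: Stat profile file
  ansible.builtin.stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat
```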
<<< 16142 1727204123.73818: stderr chunk (state=3): >>><<< 16142 1727204123.73822: stdout chunk (state=3): >>><<< 16142 1727204123.73958: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 16142 1727204123.73963: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204123.3943775-17972-183041680847673/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204123.73973: _low_level_execute_command(): starting 16142 1727204123.73976: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204123.3943775-17972-183041680847673/ > /dev/null 2>&1 && sleep 0' 16142 1727204123.75438: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204123.75443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204123.75598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204123.75603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 16142 1727204123.75605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204123.75660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204123.75788: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204123.75792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204123.75848: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204123.77657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204123.77736: stderr chunk (state=3): >>><<< 16142 1727204123.77739: stdout chunk (state=3): >>><<< 16142 1727204123.77770: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204123.77774: handler run complete 16142 1727204123.78071: attempt loop complete, returning result 16142 1727204123.78074: _execute() done 16142 1727204123.78076: dumping result to json 16142 1727204123.78078: done dumping result, returning 16142 1727204123.78080: done running TaskExecutor() for managed-node2/TASK: Stat profile file [0affcd87-79f5-fddd-f6c7-00000000062d] 16142 1727204123.78082: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000062d 16142 1727204123.78161: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000062d 16142 1727204123.78167: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 16142 1727204123.78228: no more pending results, returning what we have 16142 1727204123.78232: results queue empty 16142 1727204123.78233: checking for any_errors_fatal 16142 1727204123.78238: done checking for any_errors_fatal 16142 1727204123.78239: checking for max_fail_percentage 16142 1727204123.78240: done checking for max_fail_percentage 16142 1727204123.78241: checking to see if all hosts have failed and the running result is not ok 16142 1727204123.78242: done checking to see if all hosts have failed 16142 1727204123.78243: getting the remaining hosts for this loop 16142 1727204123.78244: done getting the remaining hosts for this loop 16142 1727204123.78248: getting the 
next task for host managed-node2 16142 1727204123.78254: done getting next task for host managed-node2 16142 1727204123.78257: ^ task is: TASK: Set NM profile exist flag based on the profile files 16142 1727204123.78261: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204123.78267: getting variables 16142 1727204123.78268: in VariableManager get_vars() 16142 1727204123.78318: Calling all_inventory to load vars for managed-node2 16142 1727204123.78321: Calling groups_inventory to load vars for managed-node2 16142 1727204123.78323: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204123.78334: Calling all_plugins_play to load vars for managed-node2 16142 1727204123.78337: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204123.78340: Calling groups_plugins_play to load vars for managed-node2 16142 1727204123.80717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204123.84421: done with get_vars() 16142 1727204123.84450: done getting variables 16142 1727204123.84516: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:55:23 -0400 (0:00:00.545) 0:00:23.022 ***** 16142 1727204123.84553: entering _queue_task() for managed-node2/set_fact 16142 1727204123.85293: worker is 1 (out of 1 available) 16142 1727204123.85305: exiting _queue_task() for managed-node2/set_fact 16142 1727204123.85316: done queuing things up, now waiting for results queue to drain 16142 1727204123.85317: waiting for pending results... 
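The task just queued (get_profile_stat.yml:17) is skipped a few entries below because profile_stat.stat.exists is false, i.e. no ifcfg file was found for bond0.0. It is presumably a one-line set_fact guarded by that condition, roughly:

```yaml
# Plausible sketch of get_profile_stat.yml:17; the skip result below quotes the
# guarding condition verbatim ("false_condition": "profile_stat.stat.exists").
- name: Set NM profile exist flag based on the profile files
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true
  when: profile_stat.stat.exists
```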
16142 1727204123.86134: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 16142 1727204123.86324: in run() - task 0affcd87-79f5-fddd-f6c7-00000000062e 16142 1727204123.86345: variable 'ansible_search_path' from source: unknown 16142 1727204123.86353: variable 'ansible_search_path' from source: unknown 16142 1727204123.86399: calling self._execute() 16142 1727204123.86508: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204123.86519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204123.86538: variable 'omit' from source: magic vars 16142 1727204123.86918: variable 'ansible_distribution_major_version' from source: facts 16142 1727204123.86938: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204123.87069: variable 'profile_stat' from source: set_fact 16142 1727204123.87089: Evaluated conditional (profile_stat.stat.exists): False 16142 1727204123.87097: when evaluation is False, skipping this task 16142 1727204123.87104: _execute() done 16142 1727204123.87111: dumping result to json 16142 1727204123.87119: done dumping result, returning 16142 1727204123.87129: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-fddd-f6c7-00000000062e] 16142 1727204123.87140: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000062e skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16142 1727204123.87292: no more pending results, returning what we have 16142 1727204123.87297: results queue empty 16142 1727204123.87298: checking for any_errors_fatal 16142 1727204123.87307: done checking for any_errors_fatal 16142 1727204123.87308: checking for max_fail_percentage 16142 1727204123.87310: done checking for max_fail_percentage 16142 1727204123.87311: checking to see if all hosts have failed and the running result is not ok 16142 1727204123.87312: done checking to see if all hosts have failed 16142 1727204123.87313: getting the remaining hosts for this loop 16142 1727204123.87315: done getting the remaining hosts for this loop 16142 1727204123.87319: getting the next task for host managed-node2 16142 1727204123.87326: done getting next task for host managed-node2 16142 1727204123.87330: ^ task is: TASK: Get NM profile info 16142 1727204123.87334: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204123.87341: getting variables 16142 1727204123.87342: in VariableManager get_vars() 16142 1727204123.87406: Calling all_inventory to load vars for managed-node2 16142 1727204123.87409: Calling groups_inventory to load vars for managed-node2 16142 1727204123.87412: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204123.87425: Calling all_plugins_play to load vars for managed-node2 16142 1727204123.87428: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204123.87431: Calling groups_plugins_play to load vars for managed-node2 16142 1727204123.88384: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000062e 16142 1727204123.88388: WORKER PROCESS EXITING 16142 1727204123.89297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204123.90929: done with get_vars() 16142 1727204123.90952: done getting variables 16142 1727204123.91013: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:55:23 -0400 (0:00:00.064) 0:00:23.087 ***** 16142 1727204123.91046: entering _queue_task() for managed-node2/shell 16142 1727204123.91368: worker is 1 (out of 1 available) 16142 1727204123.91382: exiting _queue_task() for managed-node2/shell 16142 1727204123.91394: done queuing things up, now waiting for results queue to drain 16142 1727204123.91395: waiting for pending results... 
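The shell task queued above (get_profile_stat.yml:25) runs the nmcli pipeline whose rendered command and result appear at the end of this excerpt; with profile=bond0.0 it renders to `nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc`. A plausible sketch; the register name is an assumption:

```yaml
# Plausible sketch of get_profile_stat.yml:25; the rendered "cmd" is shown in
# the command module result further down. The register name is assumed.
- name: Get NM profile info
  ansible.builtin.shell: nmcli -f NAME,FILENAME connection show |grep {{ profile }} | grep /etc
  register: nm_profile_exists
```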
16142 1727204123.91671: running TaskExecutor() for managed-node2/TASK: Get NM profile info 16142 1727204123.91796: in run() - task 0affcd87-79f5-fddd-f6c7-00000000062f 16142 1727204123.91818: variable 'ansible_search_path' from source: unknown 16142 1727204123.91826: variable 'ansible_search_path' from source: unknown 16142 1727204123.91876: calling self._execute() 16142 1727204123.91981: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204123.91992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204123.92007: variable 'omit' from source: magic vars 16142 1727204123.92392: variable 'ansible_distribution_major_version' from source: facts 16142 1727204123.92409: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204123.92421: variable 'omit' from source: magic vars 16142 1727204123.92472: variable 'omit' from source: magic vars 16142 1727204123.92580: variable 'profile' from source: include params 16142 1727204123.92588: variable 'item' from source: include params 16142 1727204123.92657: variable 'item' from source: include params 16142 1727204123.92681: variable 'omit' from source: magic vars 16142 1727204123.92728: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204123.92769: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204123.92797: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204123.92825: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204123.92841: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204123.92880: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204123.92889: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204123.92897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204123.93008: Set connection var ansible_timeout to 10 16142 1727204123.93016: Set connection var ansible_connection to ssh 16142 1727204123.93030: Set connection var ansible_shell_type to sh 16142 1727204123.93041: Set connection var ansible_shell_executable to /bin/sh 16142 1727204123.93051: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204123.93063: Set connection var ansible_pipelining to False 16142 1727204123.93092: variable 'ansible_shell_executable' from source: unknown 16142 1727204123.93099: variable 'ansible_connection' from source: unknown 16142 1727204123.93107: variable 'ansible_module_compression' from source: unknown 16142 1727204123.93113: variable 'ansible_shell_type' from source: unknown 16142 1727204123.93121: variable 'ansible_shell_executable' from source: unknown 16142 1727204123.93128: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204123.93140: variable 'ansible_pipelining' from source: unknown 16142 1727204123.93148: variable 'ansible_timeout' from source: unknown 16142 1727204123.93156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204123.93303: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204123.93320: variable 'omit' from source: magic vars 16142 1727204123.93330: starting attempt loop 16142 1727204123.93337: running the handler 16142 1727204123.93352: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204123.93382: _low_level_execute_command(): starting 16142 1727204123.93395: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204123.94173: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204123.94187: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204123.94201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204123.94219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204123.94266: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204123.94278: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204123.94291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204123.94308: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204123.94318: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204123.94328: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204123.94343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204123.94356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204123.94374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204123.94387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204123.94399: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204123.94415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204123.94499: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204123.94523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204123.94540: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204123.94683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204123.96193: stdout chunk (state=3): >>>/root <<< 16142 1727204123.96372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204123.96393: stdout chunk (state=3): >>><<< 16142 1727204123.96396: stderr chunk (state=3): >>><<< 16142 1727204123.96511: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204123.96522: _low_level_execute_command(): starting 16142 1727204123.96525: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204123.9641895-17986-105360267280301 `" && echo ansible-tmp-1727204123.9641895-17986-105360267280301="` echo /root/.ansible/tmp/ansible-tmp-1727204123.9641895-17986-105360267280301 `" ) && sleep 0' 16142 1727204123.98493: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204123.98508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204123.98522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204123.98541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204123.98589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204123.98600: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204123.98613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204123.98629: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204123.98640: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204123.98650: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204123.98660: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204123.98685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204123.98700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204123.98805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204123.98818: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204123.98831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204123.98908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204123.98930: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204123.98945: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204123.99021: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 16142 1727204124.00865: stdout chunk (state=3): >>>ansible-tmp-1727204123.9641895-17986-105360267280301=/root/.ansible/tmp/ansible-tmp-1727204123.9641895-17986-105360267280301 <<< 16142 1727204124.01063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204124.01068: stdout chunk (state=3): >>><<< 16142 1727204124.01070: stderr chunk (state=3): >>><<< 16142 1727204124.01271: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204123.9641895-17986-105360267280301=/root/.ansible/tmp/ansible-tmp-1727204123.9641895-17986-105360267280301 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204124.01275: variable 'ansible_module_compression' from source: unknown 16142 1727204124.01277: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 16142 1727204124.01279: variable 'ansible_facts' from source: unknown 16142 1727204124.01295: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204123.9641895-17986-105360267280301/AnsiballZ_command.py 16142 1727204124.01951: Sending initial data 16142 1727204124.01954: Sent initial data (156 bytes) 16142 1727204124.04453: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204124.04470: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204124.04489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204124.04509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204124.04560: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204124.04575: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204124.04590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204124.04608: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204124.04619: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204124.04629: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204124.04641: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204124.04658: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204124.04677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204124.04689: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204124.04700: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204124.04715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204124.04907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204124.04932: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204124.04950: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204124.05082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204124.06745: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204124.06788: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204124.06826: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpyvaj2ilj /root/.ansible/tmp/ansible-tmp-1727204123.9641895-17986-105360267280301/AnsiballZ_command.py <<< 16142 1727204124.06867: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204124.08082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204124.08173: stderr chunk (state=3): >>><<< 16142 1727204124.08177: stdout chunk (state=3): >>><<< 16142 1727204124.08179: done transferring module to remote 16142 1727204124.08181: _low_level_execute_command(): starting 16142 1727204124.08184: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204123.9641895-17986-105360267280301/ /root/.ansible/tmp/ansible-tmp-1727204123.9641895-17986-105360267280301/AnsiballZ_command.py && sleep 0' 16142 1727204124.09827: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204124.09841: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204124.09854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204124.09874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204124.10034: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204124.10046: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204124.10059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204124.10079: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass <<< 16142 1727204124.10089: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204124.10099: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204124.10109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204124.10123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204124.10138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204124.10151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204124.10162: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204124.10178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204124.10257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204124.10361: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204124.10381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204124.10460: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204124.12185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204124.12278: stderr chunk (state=3): >>><<< 16142 1727204124.12282: stdout chunk (state=3): >>><<< 16142 1727204124.12372: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204124.12375: _low_level_execute_command(): starting 16142 1727204124.12378: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204123.9641895-17986-105360267280301/AnsiballZ_command.py && sleep 0' 16142 1727204124.14019: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204124.14036: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204124.14051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204124.14073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204124.14120: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 
originally 10.31.13.78 <<< 16142 1727204124.14180: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204124.14196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204124.14214: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204124.14226: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204124.14245: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204124.14259: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204124.14358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204124.14379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204124.14393: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204124.14405: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204124.14421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204124.14503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204124.14578: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204124.14594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204124.14792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204124.30129: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-24 14:55:24.277411", "end": "2024-09-24 14:55:24.300583", "delta": "0:00:00.023172", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 16142 1727204124.31290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204124.31428: stderr chunk (state=3): >>><<< 16142 1727204124.31432: stdout chunk (state=3): >>><<< 16142 1727204124.31594: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-24 14:55:24.277411", "end": "2024-09-24 14:55:24.300583", "delta": "0:00:00.023172", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
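The module invocation dumped above can be read back as an ordinary shell task; the following is a hedged reconstruction from the logged module_args (the command string and _uses_shell=true), not the literal source of get_profile_stat.yml, and the register name nm_profile_exists is taken from the conditional evaluated later in this run.

# Hedged sketch of the "Get NM profile info" step, rebuilt from the module_args
# shown above; the command string is verbatim from the log, the task layout and
# ignore_errors are assumptions.
- name: Get NM profile info
  ansible.builtin.shell: nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc
  register: nm_profile_exists
  ignore_errors: true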
16142 1727204124.31605: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204123.9641895-17986-105360267280301/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204124.31609: _low_level_execute_command(): starting 16142 1727204124.31612: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204123.9641895-17986-105360267280301/ > /dev/null 2>&1 && sleep 0' 16142 1727204124.33119: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204124.33783: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204124.33795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204124.33808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204124.33851: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204124.33858: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204124.33869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204124.33883: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204124.33891: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204124.33898: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204124.33907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204124.33917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204124.33927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204124.33937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204124.33946: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204124.33955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204124.34030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204124.34051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204124.34103: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204124.34203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204124.35984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204124.36050: stderr chunk (state=3): >>><<< 16142 1727204124.36053: stdout chunk (state=3): >>><<< 16142 1727204124.36073: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204124.36082: handler run complete 16142 1727204124.36107: Evaluated conditional (False): False 16142 1727204124.36117: attempt loop complete, returning result 16142 1727204124.36120: _execute() done 16142 1727204124.36122: dumping result to json 16142 1727204124.36127: done dumping result, returning 16142 1727204124.36139: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [0affcd87-79f5-fddd-f6c7-00000000062f] 16142 1727204124.36145: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000062f 16142 1727204124.36252: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000062f 16142 1727204124.36255: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.023172", "end": "2024-09-24 14:55:24.300583", "rc": 0, "start": "2024-09-24 14:55:24.277411" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 16142 1727204124.36323: no more pending results, returning what we have 16142 1727204124.36327: results queue empty 16142 1727204124.36327: checking for any_errors_fatal 16142 1727204124.36334: done checking for any_errors_fatal 16142 1727204124.36335: checking for max_fail_percentage 16142 1727204124.36336: done checking for max_fail_percentage 16142 1727204124.36337: checking to see if all hosts have failed and the running result is not ok 16142 1727204124.36338: done checking to see if all hosts have failed 16142 1727204124.36339: getting the remaining hosts for this loop 16142 1727204124.36340: done getting the remaining hosts for this loop 16142 1727204124.36344: getting the next task for host managed-node2 16142 1727204124.36351: done getting next task for host managed-node2 16142 1727204124.36353: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 16142 1727204124.36357: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204124.36361: getting variables 16142 1727204124.36363: in VariableManager get_vars() 16142 1727204124.36418: Calling all_inventory to load vars for managed-node2 16142 1727204124.36421: Calling groups_inventory to load vars for managed-node2 16142 1727204124.36423: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204124.36433: Calling all_plugins_play to load vars for managed-node2 16142 1727204124.36435: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204124.36437: Calling groups_plugins_play to load vars for managed-node2 16142 1727204124.38997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204124.42731: done with get_vars() 16142 1727204124.42881: done getting variables 16142 1727204124.42942: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:55:24 -0400 (0:00:00.520) 0:00:23.607 ***** 16142 1727204124.43094: entering _queue_task() for managed-node2/set_fact 16142 1727204124.43774: worker is 1 (out of 1 available) 16142 1727204124.43786: exiting _queue_task() for managed-node2/set_fact 16142 1727204124.43799: done queuing things up, now waiting for results queue to drain 16142 1727204124.43800: waiting for pending results... 
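The set_fact task just queued (get_profile_stat.yml:35) populates the lsr_net_profile_* flags that the later assertions consume. A minimal sketch, assuming only what this run reports: the three flag names from the result printed below and the nm_profile_exists.rc == 0 guard it evaluates.

# Minimal sketch of the flag-setting step; the flag names and the guard come
# from this log, the exact task layout is an assumption.
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0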
16142 1727204124.44558: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 16142 1727204124.44651: in run() - task 0affcd87-79f5-fddd-f6c7-000000000630 16142 1727204124.44667: variable 'ansible_search_path' from source: unknown 16142 1727204124.44974: variable 'ansible_search_path' from source: unknown 16142 1727204124.45012: calling self._execute() 16142 1727204124.45112: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204124.45116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204124.45130: variable 'omit' from source: magic vars 16142 1727204124.45912: variable 'ansible_distribution_major_version' from source: facts 16142 1727204124.45923: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204124.46259: variable 'nm_profile_exists' from source: set_fact 16142 1727204124.46277: Evaluated conditional (nm_profile_exists.rc == 0): True 16142 1727204124.46283: variable 'omit' from source: magic vars 16142 1727204124.46327: variable 'omit' from source: magic vars 16142 1727204124.46360: variable 'omit' from source: magic vars 16142 1727204124.46607: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204124.46645: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204124.46668: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204124.46686: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204124.46698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204124.46728: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204124.46732: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204124.46735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204124.47043: Set connection var ansible_timeout to 10 16142 1727204124.47046: Set connection var ansible_connection to ssh 16142 1727204124.47051: Set connection var ansible_shell_type to sh 16142 1727204124.47057: Set connection var ansible_shell_executable to /bin/sh 16142 1727204124.47063: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204124.47073: Set connection var ansible_pipelining to False 16142 1727204124.47098: variable 'ansible_shell_executable' from source: unknown 16142 1727204124.47101: variable 'ansible_connection' from source: unknown 16142 1727204124.47106: variable 'ansible_module_compression' from source: unknown 16142 1727204124.47108: variable 'ansible_shell_type' from source: unknown 16142 1727204124.47110: variable 'ansible_shell_executable' from source: unknown 16142 1727204124.47113: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204124.47116: variable 'ansible_pipelining' from source: unknown 16142 1727204124.47118: variable 'ansible_timeout' from source: unknown 16142 1727204124.47121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204124.47777: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204124.47789: variable 'omit' from source: magic vars 16142 1727204124.47792: starting attempt loop 16142 1727204124.47795: running the handler 16142 1727204124.47809: handler run complete 16142 1727204124.47820: attempt loop complete, returning result 16142 1727204124.47823: _execute() done 16142 1727204124.47825: dumping result to json 16142 1727204124.47828: done dumping result, returning 16142 1727204124.47834: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-fddd-f6c7-000000000630] 16142 1727204124.47844: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000630 16142 1727204124.47940: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000630 16142 1727204124.47942: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 16142 1727204124.47999: no more pending results, returning what we have 16142 1727204124.48003: results queue empty 16142 1727204124.48003: checking for any_errors_fatal 16142 1727204124.48014: done checking for any_errors_fatal 16142 1727204124.48015: checking for max_fail_percentage 16142 1727204124.48016: done checking for max_fail_percentage 16142 1727204124.48017: checking to see if all hosts have failed and the running result is not ok 16142 1727204124.48018: done checking to see if all hosts have failed 16142 1727204124.48018: getting the remaining hosts for this loop 16142 1727204124.48020: done getting the remaining hosts for this loop 16142 1727204124.48023: getting the next task for host managed-node2 16142 1727204124.48038: done getting next task for host managed-node2 16142 1727204124.48061: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 16142 1727204124.48066: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204124.48070: getting variables 16142 1727204124.48072: in VariableManager get_vars() 16142 1727204124.48128: Calling all_inventory to load vars for managed-node2 16142 1727204124.48131: Calling groups_inventory to load vars for managed-node2 16142 1727204124.48137: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204124.48147: Calling all_plugins_play to load vars for managed-node2 16142 1727204124.48149: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204124.48151: Calling groups_plugins_play to load vars for managed-node2 16142 1727204124.51499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204124.54351: done with get_vars() 16142 1727204124.54382: done getting variables 16142 1727204124.54572: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204124.54813: variable 'profile' from source: include params 16142 1727204124.54817: variable 'item' from source: include params 16142 1727204124.55001: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:55:24 -0400 (0:00:00.119) 0:00:23.727 ***** 16142 1727204124.55042: entering _queue_task() for managed-node2/command 16142 1727204124.55713: worker is 1 (out of 1 available) 16142 1727204124.55739: exiting _queue_task() for managed-node2/command 16142 1727204124.55750: done queuing things up, now waiting for results queue to drain 16142 1727204124.55751: waiting for pending results... 
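This task (get_profile_stat.yml:49) and the three ifcfg-related tasks that follow it are all guarded by profile_stat.stat.exists; because the bond0.0 profile lives under /etc/NetworkManager/system-connections rather than an ifcfg file, the guard evaluates to False and each of them is skipped below. A hedged sketch of the guard pattern follows; only the task name, the file/line, and the condition come from the log, while the grep payload and the ifcfg path are hypothetical.

# Hypothetical sketch of the skipped ifcfg check; the command body is assumed.
- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  ansible.builtin.command: grep 'ansible_managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  when: profile_stat.stat.exists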
16142 1727204124.56509: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-bond0.0 16142 1727204124.56767: in run() - task 0affcd87-79f5-fddd-f6c7-000000000632 16142 1727204124.56787: variable 'ansible_search_path' from source: unknown 16142 1727204124.56820: variable 'ansible_search_path' from source: unknown 16142 1727204124.56959: calling self._execute() 16142 1727204124.57240: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204124.57257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204124.57323: variable 'omit' from source: magic vars 16142 1727204124.58090: variable 'ansible_distribution_major_version' from source: facts 16142 1727204124.58109: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204124.58256: variable 'profile_stat' from source: set_fact 16142 1727204124.58284: Evaluated conditional (profile_stat.stat.exists): False 16142 1727204124.58295: when evaluation is False, skipping this task 16142 1727204124.58302: _execute() done 16142 1727204124.58309: dumping result to json 16142 1727204124.58317: done dumping result, returning 16142 1727204124.58327: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [0affcd87-79f5-fddd-f6c7-000000000632] 16142 1727204124.58343: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000632 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16142 1727204124.58554: no more pending results, returning what we have 16142 1727204124.58558: results queue empty 16142 1727204124.58560: checking for any_errors_fatal 16142 1727204124.58570: done checking for any_errors_fatal 16142 1727204124.58571: checking for max_fail_percentage 16142 1727204124.58574: done checking for max_fail_percentage 16142 1727204124.58575: checking to see if all hosts have failed and the running result is not ok 16142 1727204124.58576: done checking to see if all hosts have failed 16142 1727204124.58577: getting the remaining hosts for this loop 16142 1727204124.58578: done getting the remaining hosts for this loop 16142 1727204124.58582: getting the next task for host managed-node2 16142 1727204124.58590: done getting next task for host managed-node2 16142 1727204124.58593: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 16142 1727204124.58598: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204124.58606: getting variables 16142 1727204124.58608: in VariableManager get_vars() 16142 1727204124.58675: Calling all_inventory to load vars for managed-node2 16142 1727204124.58679: Calling groups_inventory to load vars for managed-node2 16142 1727204124.58682: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204124.58695: Calling all_plugins_play to load vars for managed-node2 16142 1727204124.58699: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204124.58702: Calling groups_plugins_play to load vars for managed-node2 16142 1727204124.59785: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000632 16142 1727204124.59789: WORKER PROCESS EXITING 16142 1727204124.61940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204124.64339: done with get_vars() 16142 1727204124.64374: done getting variables 16142 1727204124.64445: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204124.64696: variable 'profile' from source: include params 16142 1727204124.64701: variable 'item' from source: include params 16142 1727204124.64773: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:55:24 -0400 (0:00:00.097) 0:00:23.824 ***** 16142 1727204124.64920: entering _queue_task() for managed-node2/set_fact 16142 1727204124.65594: worker is 1 (out of 1 available) 16142 1727204124.65608: exiting _queue_task() for managed-node2/set_fact 16142 1727204124.65620: done queuing things up, now waiting for results queue to drain 16142 1727204124.65621: waiting for pending results... 
16142 1727204124.65957: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 16142 1727204124.66097: in run() - task 0affcd87-79f5-fddd-f6c7-000000000633 16142 1727204124.66125: variable 'ansible_search_path' from source: unknown 16142 1727204124.66140: variable 'ansible_search_path' from source: unknown 16142 1727204124.66188: calling self._execute() 16142 1727204124.66311: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204124.66323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204124.66346: variable 'omit' from source: magic vars 16142 1727204124.66786: variable 'ansible_distribution_major_version' from source: facts 16142 1727204124.66810: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204124.66973: variable 'profile_stat' from source: set_fact 16142 1727204124.66998: Evaluated conditional (profile_stat.stat.exists): False 16142 1727204124.67014: when evaluation is False, skipping this task 16142 1727204124.67022: _execute() done 16142 1727204124.67029: dumping result to json 16142 1727204124.67040: done dumping result, returning 16142 1727204124.67055: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [0affcd87-79f5-fddd-f6c7-000000000633] 16142 1727204124.67070: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000633 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16142 1727204124.67260: no more pending results, returning what we have 16142 1727204124.67266: results queue empty 16142 1727204124.67268: checking for any_errors_fatal 16142 1727204124.67274: done checking for any_errors_fatal 16142 1727204124.67275: checking for max_fail_percentage 16142 1727204124.67277: done checking for max_fail_percentage 16142 1727204124.67279: checking to see if all hosts have failed and the running result is not ok 16142 1727204124.67280: done checking to see if all hosts have failed 16142 1727204124.67280: getting the remaining hosts for this loop 16142 1727204124.67282: done getting the remaining hosts for this loop 16142 1727204124.67286: getting the next task for host managed-node2 16142 1727204124.67294: done getting next task for host managed-node2 16142 1727204124.67297: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 16142 1727204124.67302: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204124.67307: getting variables 16142 1727204124.67309: in VariableManager get_vars() 16142 1727204124.67374: Calling all_inventory to load vars for managed-node2 16142 1727204124.67377: Calling groups_inventory to load vars for managed-node2 16142 1727204124.67380: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204124.67393: Calling all_plugins_play to load vars for managed-node2 16142 1727204124.67397: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204124.67400: Calling groups_plugins_play to load vars for managed-node2 16142 1727204124.68428: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000633 16142 1727204124.68434: WORKER PROCESS EXITING 16142 1727204124.69609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204124.72635: done with get_vars() 16142 1727204124.72670: done getting variables 16142 1727204124.72743: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204124.72875: variable 'profile' from source: include params 16142 1727204124.72879: variable 'item' from source: include params 16142 1727204124.72953: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:55:24 -0400 (0:00:00.080) 0:00:23.906 ***** 16142 1727204124.72988: entering _queue_task() for managed-node2/command 16142 1727204124.73801: worker is 1 (out of 1 available) 16142 1727204124.73815: exiting _queue_task() for managed-node2/command 16142 1727204124.73826: done queuing things up, now waiting for results queue to drain 16142 1727204124.73827: waiting for pending results... 
16142 1727204124.74524: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-bond0.0 16142 1727204124.74672: in run() - task 0affcd87-79f5-fddd-f6c7-000000000634 16142 1727204124.74692: variable 'ansible_search_path' from source: unknown 16142 1727204124.74699: variable 'ansible_search_path' from source: unknown 16142 1727204124.74742: calling self._execute() 16142 1727204124.74851: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204124.74875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204124.74889: variable 'omit' from source: magic vars 16142 1727204124.75305: variable 'ansible_distribution_major_version' from source: facts 16142 1727204124.75328: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204124.75475: variable 'profile_stat' from source: set_fact 16142 1727204124.75495: Evaluated conditional (profile_stat.stat.exists): False 16142 1727204124.75503: when evaluation is False, skipping this task 16142 1727204124.75512: _execute() done 16142 1727204124.75533: dumping result to json 16142 1727204124.75541: done dumping result, returning 16142 1727204124.75551: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-bond0.0 [0affcd87-79f5-fddd-f6c7-000000000634] 16142 1727204124.75562: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000634 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16142 1727204124.75728: no more pending results, returning what we have 16142 1727204124.75733: results queue empty 16142 1727204124.75734: checking for any_errors_fatal 16142 1727204124.75742: done checking for any_errors_fatal 16142 1727204124.75743: checking for max_fail_percentage 16142 1727204124.75745: done checking for max_fail_percentage 16142 1727204124.75745: checking to see if all hosts have failed and the running result is not ok 16142 1727204124.75746: done checking to see if all hosts have failed 16142 1727204124.75747: getting the remaining hosts for this loop 16142 1727204124.75748: done getting the remaining hosts for this loop 16142 1727204124.75752: getting the next task for host managed-node2 16142 1727204124.75760: done getting next task for host managed-node2 16142 1727204124.75765: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 16142 1727204124.75770: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204124.75776: getting variables 16142 1727204124.75779: in VariableManager get_vars() 16142 1727204124.75838: Calling all_inventory to load vars for managed-node2 16142 1727204124.75841: Calling groups_inventory to load vars for managed-node2 16142 1727204124.75844: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204124.75856: Calling all_plugins_play to load vars for managed-node2 16142 1727204124.75859: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204124.75862: Calling groups_plugins_play to load vars for managed-node2 16142 1727204124.76971: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000634 16142 1727204124.76975: WORKER PROCESS EXITING 16142 1727204124.78740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204124.80635: done with get_vars() 16142 1727204124.80658: done getting variables 16142 1727204124.80850: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204124.81094: variable 'profile' from source: include params 16142 1727204124.81099: variable 'item' from source: include params 16142 1727204124.81288: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:55:24 -0400 (0:00:00.083) 0:00:23.990 ***** 16142 1727204124.81319: entering _queue_task() for managed-node2/set_fact 16142 1727204124.82139: worker is 1 (out of 1 available) 16142 1727204124.82155: exiting _queue_task() for managed-node2/set_fact 16142 1727204124.82170: done queuing things up, now waiting for results queue to drain 16142 1727204124.82172: waiting for pending results... 
16142 1727204124.82518: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-bond0.0 16142 1727204124.82652: in run() - task 0affcd87-79f5-fddd-f6c7-000000000635 16142 1727204124.82738: variable 'ansible_search_path' from source: unknown 16142 1727204124.82745: variable 'ansible_search_path' from source: unknown 16142 1727204124.82842: calling self._execute() 16142 1727204124.83038: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204124.83109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204124.83131: variable 'omit' from source: magic vars 16142 1727204124.84016: variable 'ansible_distribution_major_version' from source: facts 16142 1727204124.84037: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204124.84290: variable 'profile_stat' from source: set_fact 16142 1727204124.84373: Evaluated conditional (profile_stat.stat.exists): False 16142 1727204124.84381: when evaluation is False, skipping this task 16142 1727204124.84388: _execute() done 16142 1727204124.84395: dumping result to json 16142 1727204124.84404: done dumping result, returning 16142 1727204124.84420: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [0affcd87-79f5-fddd-f6c7-000000000635] 16142 1727204124.84435: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000635 16142 1727204124.84554: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000635 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16142 1727204124.84608: no more pending results, returning what we have 16142 1727204124.84613: results queue empty 16142 1727204124.84614: checking for any_errors_fatal 16142 1727204124.84621: done checking for any_errors_fatal 16142 1727204124.84622: checking for max_fail_percentage 16142 1727204124.84624: done checking for max_fail_percentage 16142 1727204124.84625: checking to see if all hosts have failed and the running result is not ok 16142 1727204124.84626: done checking to see if all hosts have failed 16142 1727204124.84627: getting the remaining hosts for this loop 16142 1727204124.84628: done getting the remaining hosts for this loop 16142 1727204124.84633: getting the next task for host managed-node2 16142 1727204124.84642: done getting next task for host managed-node2 16142 1727204124.84645: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 16142 1727204124.84651: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204124.84656: getting variables 16142 1727204124.84659: in VariableManager get_vars() 16142 1727204124.84722: Calling all_inventory to load vars for managed-node2 16142 1727204124.84726: Calling groups_inventory to load vars for managed-node2 16142 1727204124.84728: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204124.84741: Calling all_plugins_play to load vars for managed-node2 16142 1727204124.84744: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204124.84747: Calling groups_plugins_play to load vars for managed-node2 16142 1727204124.85724: WORKER PROCESS EXITING 16142 1727204124.86841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204124.88843: done with get_vars() 16142 1727204124.88875: done getting variables 16142 1727204124.88947: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204124.89083: variable 'profile' from source: include params 16142 1727204124.89087: variable 'item' from source: include params 16142 1727204124.89161: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:55:24 -0400 (0:00:00.078) 0:00:24.068 ***** 16142 1727204124.89195: entering _queue_task() for managed-node2/assert 16142 1727204124.89587: worker is 1 (out of 1 available) 16142 1727204124.89599: exiting _queue_task() for managed-node2/assert 16142 1727204124.89613: done queuing things up, now waiting for results queue to drain 16142 1727204124.89614: waiting for pending results... 
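The assertion queued above (assert_profile_present.yml:5) checks the flag set earlier in this block; a minimal sketch, with only the lsr_net_profile_exists condition taken from the evaluation logged below.

# Minimal sketch of the presence assertion; anything beyond the condition is assumed.
- name: Assert that the profile is present - '{{ profile }}'
  ansible.builtin.assert:
    that:
      - lsr_net_profile_exists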
16142 1727204124.89928: running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'bond0.0' 16142 1727204124.90048: in run() - task 0affcd87-79f5-fddd-f6c7-00000000035d 16142 1727204124.90074: variable 'ansible_search_path' from source: unknown 16142 1727204124.90082: variable 'ansible_search_path' from source: unknown 16142 1727204124.90133: calling self._execute() 16142 1727204124.90249: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204124.90262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204124.90280: variable 'omit' from source: magic vars 16142 1727204124.90719: variable 'ansible_distribution_major_version' from source: facts 16142 1727204124.90739: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204124.90763: variable 'omit' from source: magic vars 16142 1727204124.90819: variable 'omit' from source: magic vars 16142 1727204124.90942: variable 'profile' from source: include params 16142 1727204124.90951: variable 'item' from source: include params 16142 1727204124.91030: variable 'item' from source: include params 16142 1727204124.91056: variable 'omit' from source: magic vars 16142 1727204124.91115: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204124.91160: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204124.91198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204124.91226: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204124.91243: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204124.91282: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204124.91295: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204124.91309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204124.91446: Set connection var ansible_timeout to 10 16142 1727204124.91454: Set connection var ansible_connection to ssh 16142 1727204124.91466: Set connection var ansible_shell_type to sh 16142 1727204124.91478: Set connection var ansible_shell_executable to /bin/sh 16142 1727204124.91486: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204124.91497: Set connection var ansible_pipelining to False 16142 1727204124.91535: variable 'ansible_shell_executable' from source: unknown 16142 1727204124.91548: variable 'ansible_connection' from source: unknown 16142 1727204124.91557: variable 'ansible_module_compression' from source: unknown 16142 1727204124.91563: variable 'ansible_shell_type' from source: unknown 16142 1727204124.91572: variable 'ansible_shell_executable' from source: unknown 16142 1727204124.91587: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204124.91596: variable 'ansible_pipelining' from source: unknown 16142 1727204124.91605: variable 'ansible_timeout' from source: unknown 16142 1727204124.91625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204124.91792: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204124.91812: variable 'omit' from source: magic vars 16142 1727204124.91822: starting attempt loop 16142 1727204124.91830: running the handler 16142 1727204124.91968: variable 'lsr_net_profile_exists' from source: set_fact 16142 1727204124.91981: Evaluated conditional (lsr_net_profile_exists): True 16142 1727204124.91991: handler run complete 16142 1727204124.92013: attempt loop complete, returning result 16142 1727204124.92023: _execute() done 16142 1727204124.92030: dumping result to json 16142 1727204124.92037: done dumping result, returning 16142 1727204124.92050: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'bond0.0' [0affcd87-79f5-fddd-f6c7-00000000035d] 16142 1727204124.92071: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000035d ok: [managed-node2] => { "changed": false } MSG: All assertions passed 16142 1727204124.92223: no more pending results, returning what we have 16142 1727204124.92227: results queue empty 16142 1727204124.92228: checking for any_errors_fatal 16142 1727204124.92233: done checking for any_errors_fatal 16142 1727204124.92234: checking for max_fail_percentage 16142 1727204124.92236: done checking for max_fail_percentage 16142 1727204124.92237: checking to see if all hosts have failed and the running result is not ok 16142 1727204124.92237: done checking to see if all hosts have failed 16142 1727204124.92238: getting the remaining hosts for this loop 16142 1727204124.92240: done getting the remaining hosts for this loop 16142 1727204124.92243: getting the next task for host managed-node2 16142 1727204124.92251: done getting next task for host managed-node2 16142 1727204124.92254: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 16142 1727204124.92256: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204124.92260: getting variables 16142 1727204124.92262: in VariableManager get_vars() 16142 1727204124.92323: Calling all_inventory to load vars for managed-node2 16142 1727204124.92326: Calling groups_inventory to load vars for managed-node2 16142 1727204124.92328: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204124.92339: Calling all_plugins_play to load vars for managed-node2 16142 1727204124.92342: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204124.92345: Calling groups_plugins_play to load vars for managed-node2 16142 1727204124.93978: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000035d 16142 1727204124.93982: WORKER PROCESS EXITING 16142 1727204124.94585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204124.97102: done with get_vars() 16142 1727204124.97138: done getting variables 16142 1727204124.97203: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204124.97324: variable 'profile' from source: include params 16142 1727204124.97328: variable 'item' from source: include params 16142 1727204124.97389: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:55:24 -0400 (0:00:00.082) 0:00:24.151 ***** 16142 1727204124.97426: entering _queue_task() for managed-node2/assert 16142 1727204124.97738: worker is 1 (out of 1 available) 16142 1727204124.97751: exiting _queue_task() for managed-node2/assert 16142 1727204124.97761: done queuing things up, now waiting for results queue to drain 16142 1727204124.97762: waiting for pending results... 
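The remaining two assertions (assert_profile_present.yml:10 and :15) follow the same pattern against the other two flags; in this hedged sketch the ansible_managed condition is taken from the evaluation logged below, while the fingerprint condition is an assumption based on the flag set earlier in this run.

# Hedged sketch of the last two assertions; the layout and the fingerprint
# condition are assumed.
- name: Assert that the ansible managed comment is present in '{{ profile }}'
  ansible.builtin.assert:
    that:
      - lsr_net_profile_ansible_managed

- name: Assert that the fingerprint comment is present in {{ profile }}
  ansible.builtin.assert:
    that:
      - lsr_net_profile_fingerprint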
16142 1727204124.99176: running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'bond0.0' 16142 1727204124.99313: in run() - task 0affcd87-79f5-fddd-f6c7-00000000035e 16142 1727204124.99340: variable 'ansible_search_path' from source: unknown 16142 1727204124.99344: variable 'ansible_search_path' from source: unknown 16142 1727204124.99383: calling self._execute() 16142 1727204124.99534: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204124.99548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204124.99559: variable 'omit' from source: magic vars 16142 1727204125.00142: variable 'ansible_distribution_major_version' from source: facts 16142 1727204125.00157: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204125.00170: variable 'omit' from source: magic vars 16142 1727204125.00228: variable 'omit' from source: magic vars 16142 1727204125.00402: variable 'profile' from source: include params 16142 1727204125.00406: variable 'item' from source: include params 16142 1727204125.00527: variable 'item' from source: include params 16142 1727204125.00555: variable 'omit' from source: magic vars 16142 1727204125.00600: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204125.00654: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204125.00677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204125.00695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204125.00707: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204125.00747: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204125.00751: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204125.00753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204125.00906: Set connection var ansible_timeout to 10 16142 1727204125.00912: Set connection var ansible_connection to ssh 16142 1727204125.00931: Set connection var ansible_shell_type to sh 16142 1727204125.00997: Set connection var ansible_shell_executable to /bin/sh 16142 1727204125.01003: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204125.01011: Set connection var ansible_pipelining to False 16142 1727204125.01034: variable 'ansible_shell_executable' from source: unknown 16142 1727204125.01040: variable 'ansible_connection' from source: unknown 16142 1727204125.01043: variable 'ansible_module_compression' from source: unknown 16142 1727204125.01046: variable 'ansible_shell_type' from source: unknown 16142 1727204125.01048: variable 'ansible_shell_executable' from source: unknown 16142 1727204125.01050: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204125.01056: variable 'ansible_pipelining' from source: unknown 16142 1727204125.01059: variable 'ansible_timeout' from source: unknown 16142 1727204125.01067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204125.01591: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204125.01602: variable 'omit' from source: magic vars 16142 1727204125.01609: starting attempt loop 16142 1727204125.01612: running the handler 16142 1727204125.01744: variable 'lsr_net_profile_ansible_managed' from source: set_fact 16142 1727204125.01748: Evaluated conditional (lsr_net_profile_ansible_managed): True 16142 1727204125.01765: handler run complete 16142 1727204125.01781: attempt loop complete, returning result 16142 1727204125.01784: _execute() done 16142 1727204125.01787: dumping result to json 16142 1727204125.01789: done dumping result, returning 16142 1727204125.01796: done running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'bond0.0' [0affcd87-79f5-fddd-f6c7-00000000035e] 16142 1727204125.01802: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000035e 16142 1727204125.01894: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000035e 16142 1727204125.01897: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 16142 1727204125.01944: no more pending results, returning what we have 16142 1727204125.01948: results queue empty 16142 1727204125.01949: checking for any_errors_fatal 16142 1727204125.01954: done checking for any_errors_fatal 16142 1727204125.01954: checking for max_fail_percentage 16142 1727204125.01956: done checking for max_fail_percentage 16142 1727204125.01957: checking to see if all hosts have failed and the running result is not ok 16142 1727204125.01957: done checking to see if all hosts have failed 16142 1727204125.01958: getting the remaining hosts for this loop 16142 1727204125.01959: done getting the remaining hosts for this loop 16142 1727204125.01963: getting the next task for host managed-node2 16142 1727204125.01975: done getting next task for host managed-node2 16142 1727204125.01978: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 16142 1727204125.01981: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204125.01985: getting variables 16142 1727204125.01987: in VariableManager get_vars() 16142 1727204125.02047: Calling all_inventory to load vars for managed-node2 16142 1727204125.02050: Calling groups_inventory to load vars for managed-node2 16142 1727204125.02052: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204125.02062: Calling all_plugins_play to load vars for managed-node2 16142 1727204125.02066: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204125.02069: Calling groups_plugins_play to load vars for managed-node2 16142 1727204125.03710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204125.06152: done with get_vars() 16142 1727204125.06192: done getting variables 16142 1727204125.06259: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204125.06502: variable 'profile' from source: include params 16142 1727204125.06505: variable 'item' from source: include params 16142 1727204125.06608: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:55:25 -0400 (0:00:00.092) 0:00:24.243 ***** 16142 1727204125.06646: entering _queue_task() for managed-node2/assert 16142 1727204125.06894: worker is 1 (out of 1 available) 16142 1727204125.06907: exiting _queue_task() for managed-node2/assert 16142 1727204125.06918: done queuing things up, now waiting for results queue to drain 16142 1727204125.06920: waiting for pending results... 
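Note on the variables resolved "from source: include params" above: profile and item are supplied by whatever includes assert_profile_present.yml, and the rendered names in this section alternate between bond0.0 (these asserts) and bond0.1 (the stat further down). A hedged sketch of such an including task is shown below; the task name and the exact loop list are assumptions, since only bond0.0 and bond0.1 actually appear in this part of the log.

    # Hypothetical caller; illustrates how profile/item end up as "include params".
    - name: Assert each bond profile is present   # assumed name
      include_tasks: tasks/assert_profile_present.yml
      vars:
        profile: "{{ item }}"
      loop:
        - bond0.0
        - bond0.1
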
16142 1727204125.07109: running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in bond0.0 16142 1727204125.07206: in run() - task 0affcd87-79f5-fddd-f6c7-00000000035f 16142 1727204125.07216: variable 'ansible_search_path' from source: unknown 16142 1727204125.07223: variable 'ansible_search_path' from source: unknown 16142 1727204125.07297: calling self._execute() 16142 1727204125.07401: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204125.07412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204125.07425: variable 'omit' from source: magic vars 16142 1727204125.07842: variable 'ansible_distribution_major_version' from source: facts 16142 1727204125.07859: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204125.07874: variable 'omit' from source: magic vars 16142 1727204125.07921: variable 'omit' from source: magic vars 16142 1727204125.08033: variable 'profile' from source: include params 16142 1727204125.08052: variable 'item' from source: include params 16142 1727204125.08119: variable 'item' from source: include params 16142 1727204125.08145: variable 'omit' from source: magic vars 16142 1727204125.08199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204125.08241: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204125.08278: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204125.08302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204125.08321: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204125.08355: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204125.08365: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204125.08381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204125.08496: Set connection var ansible_timeout to 10 16142 1727204125.08504: Set connection var ansible_connection to ssh 16142 1727204125.08514: Set connection var ansible_shell_type to sh 16142 1727204125.08523: Set connection var ansible_shell_executable to /bin/sh 16142 1727204125.08531: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204125.08542: Set connection var ansible_pipelining to False 16142 1727204125.08570: variable 'ansible_shell_executable' from source: unknown 16142 1727204125.08577: variable 'ansible_connection' from source: unknown 16142 1727204125.08585: variable 'ansible_module_compression' from source: unknown 16142 1727204125.08596: variable 'ansible_shell_type' from source: unknown 16142 1727204125.08608: variable 'ansible_shell_executable' from source: unknown 16142 1727204125.08615: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204125.08622: variable 'ansible_pipelining' from source: unknown 16142 1727204125.08629: variable 'ansible_timeout' from source: unknown 16142 1727204125.08636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204125.08792: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204125.08815: variable 'omit' from source: magic vars 16142 1727204125.08831: starting attempt loop 16142 1727204125.08838: running the handler 16142 1727204125.08956: variable 'lsr_net_profile_fingerprint' from source: set_fact 16142 1727204125.08969: Evaluated conditional (lsr_net_profile_fingerprint): True 16142 1727204125.08981: handler run complete 16142 1727204125.08999: attempt loop complete, returning result 16142 1727204125.09005: _execute() done 16142 1727204125.09010: dumping result to json 16142 1727204125.09017: done dumping result, returning 16142 1727204125.09046: done running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in bond0.0 [0affcd87-79f5-fddd-f6c7-00000000035f] 16142 1727204125.09061: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000035f ok: [managed-node2] => { "changed": false } MSG: All assertions passed 16142 1727204125.09232: no more pending results, returning what we have 16142 1727204125.09236: results queue empty 16142 1727204125.09238: checking for any_errors_fatal 16142 1727204125.09248: done checking for any_errors_fatal 16142 1727204125.09249: checking for max_fail_percentage 16142 1727204125.09251: done checking for max_fail_percentage 16142 1727204125.09252: checking to see if all hosts have failed and the running result is not ok 16142 1727204125.09253: done checking to see if all hosts have failed 16142 1727204125.09254: getting the remaining hosts for this loop 16142 1727204125.09255: done getting the remaining hosts for this loop 16142 1727204125.09259: getting the next task for host managed-node2 16142 1727204125.09392: done getting next task for host managed-node2 16142 1727204125.09396: ^ task is: TASK: Include the task 'get_profile_stat.yml' 16142 1727204125.09398: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204125.09404: getting variables 16142 1727204125.09406: in VariableManager get_vars() 16142 1727204125.09460: Calling all_inventory to load vars for managed-node2 16142 1727204125.09463: Calling groups_inventory to load vars for managed-node2 16142 1727204125.09471: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204125.09481: Calling all_plugins_play to load vars for managed-node2 16142 1727204125.09484: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204125.09486: Calling groups_plugins_play to load vars for managed-node2 16142 1727204125.10094: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000035f 16142 1727204125.10102: WORKER PROCESS EXITING 16142 1727204125.11247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204125.13389: done with get_vars() 16142 1727204125.13498: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:55:25 -0400 (0:00:00.071) 0:00:24.314 ***** 16142 1727204125.13762: entering _queue_task() for managed-node2/include_tasks 16142 1727204125.14418: worker is 1 (out of 1 available) 16142 1727204125.14430: exiting _queue_task() for managed-node2/include_tasks 16142 1727204125.14444: done queuing things up, now waiting for results queue to drain 16142 1727204125.14445: waiting for pending results... 16142 1727204125.15157: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 16142 1727204125.15280: in run() - task 0affcd87-79f5-fddd-f6c7-000000000363 16142 1727204125.15296: variable 'ansible_search_path' from source: unknown 16142 1727204125.15308: variable 'ansible_search_path' from source: unknown 16142 1727204125.15340: calling self._execute() 16142 1727204125.15458: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204125.15466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204125.15496: variable 'omit' from source: magic vars 16142 1727204125.15954: variable 'ansible_distribution_major_version' from source: facts 16142 1727204125.15976: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204125.15980: _execute() done 16142 1727204125.15982: dumping result to json 16142 1727204125.15985: done dumping result, returning 16142 1727204125.15988: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-fddd-f6c7-000000000363] 16142 1727204125.15997: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000363 16142 1727204125.16135: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000363 16142 1727204125.16138: WORKER PROCESS EXITING 16142 1727204125.16186: no more pending results, returning what we have 16142 1727204125.16192: in VariableManager get_vars() 16142 1727204125.16252: Calling all_inventory to load vars for managed-node2 16142 1727204125.16255: Calling groups_inventory to load vars for managed-node2 16142 1727204125.16257: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204125.16268: Calling all_plugins_play to load vars for managed-node2 16142 1727204125.16270: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204125.16273: Calling groups_plugins_play 
to load vars for managed-node2 16142 1727204125.18856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204125.20478: done with get_vars() 16142 1727204125.20494: variable 'ansible_search_path' from source: unknown 16142 1727204125.20496: variable 'ansible_search_path' from source: unknown 16142 1727204125.20523: we have included files to process 16142 1727204125.20524: generating all_blocks data 16142 1727204125.20525: done generating all_blocks data 16142 1727204125.20529: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 16142 1727204125.20530: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 16142 1727204125.20531: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 16142 1727204125.21154: done processing included file 16142 1727204125.21156: iterating over new_blocks loaded from include file 16142 1727204125.21158: in VariableManager get_vars() 16142 1727204125.21178: done with get_vars() 16142 1727204125.21180: filtering new block on tags 16142 1727204125.21195: done filtering new block on tags 16142 1727204125.21197: in VariableManager get_vars() 16142 1727204125.21213: done with get_vars() 16142 1727204125.21214: filtering new block on tags 16142 1727204125.21226: done filtering new block on tags 16142 1727204125.21228: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 16142 1727204125.21232: extending task lists for all hosts with included blocks 16142 1727204125.21342: done extending task lists 16142 1727204125.21343: done processing included files 16142 1727204125.21344: results queue empty 16142 1727204125.21344: checking for any_errors_fatal 16142 1727204125.21346: done checking for any_errors_fatal 16142 1727204125.21347: checking for max_fail_percentage 16142 1727204125.21347: done checking for max_fail_percentage 16142 1727204125.21348: checking to see if all hosts have failed and the running result is not ok 16142 1727204125.21348: done checking to see if all hosts have failed 16142 1727204125.21349: getting the remaining hosts for this loop 16142 1727204125.21350: done getting the remaining hosts for this loop 16142 1727204125.21351: getting the next task for host managed-node2 16142 1727204125.21354: done getting next task for host managed-node2 16142 1727204125.21356: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 16142 1727204125.21358: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204125.21361: getting variables 16142 1727204125.21362: in VariableManager get_vars() 16142 1727204125.21377: Calling all_inventory to load vars for managed-node2 16142 1727204125.21379: Calling groups_inventory to load vars for managed-node2 16142 1727204125.21380: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204125.21384: Calling all_plugins_play to load vars for managed-node2 16142 1727204125.21386: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204125.21388: Calling groups_plugins_play to load vars for managed-node2 16142 1727204125.22338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204125.24408: done with get_vars() 16142 1727204125.24432: done getting variables 16142 1727204125.24476: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:55:25 -0400 (0:00:00.107) 0:00:24.422 ***** 16142 1727204125.24521: entering _queue_task() for managed-node2/set_fact 16142 1727204125.24761: worker is 1 (out of 1 available) 16142 1727204125.24775: exiting _queue_task() for managed-node2/set_fact 16142 1727204125.24787: done queuing things up, now waiting for results queue to drain 16142 1727204125.24788: waiting for pending results... 
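The two tasks that follow come from the just-included get_profile_stat.yml (lines 3 and 9 per the task paths). Their logged output, the three flags set to false and the stat module arguments for /etc/sysconfig/network-scripts/ifcfg-bond0.1, suggests the relevant portion of that file looks roughly like this; the path template and the profile_stat register name are inferred rather than quoted (the register name from the false_condition of the skipped task later on).

    # Sketch inferred from the set_fact result and stat module_args in this log.
    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false

    - name: Stat profile file
      stat:
        path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"  # rendered as ifcfg-bond0.1 below
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat  # name inferred from the later false_condition
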
16142 1727204125.25159: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 16142 1727204125.25270: in run() - task 0affcd87-79f5-fddd-f6c7-000000000674 16142 1727204125.25280: variable 'ansible_search_path' from source: unknown 16142 1727204125.25284: variable 'ansible_search_path' from source: unknown 16142 1727204125.25316: calling self._execute() 16142 1727204125.25480: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204125.25502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204125.25522: variable 'omit' from source: magic vars 16142 1727204125.25853: variable 'ansible_distribution_major_version' from source: facts 16142 1727204125.25876: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204125.25889: variable 'omit' from source: magic vars 16142 1727204125.25925: variable 'omit' from source: magic vars 16142 1727204125.25952: variable 'omit' from source: magic vars 16142 1727204125.25988: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204125.26016: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204125.26035: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204125.26051: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204125.26061: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204125.26088: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204125.26092: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204125.26095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204125.26172: Set connection var ansible_timeout to 10 16142 1727204125.26176: Set connection var ansible_connection to ssh 16142 1727204125.26178: Set connection var ansible_shell_type to sh 16142 1727204125.26185: Set connection var ansible_shell_executable to /bin/sh 16142 1727204125.26188: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204125.26196: Set connection var ansible_pipelining to False 16142 1727204125.26214: variable 'ansible_shell_executable' from source: unknown 16142 1727204125.26217: variable 'ansible_connection' from source: unknown 16142 1727204125.26220: variable 'ansible_module_compression' from source: unknown 16142 1727204125.26222: variable 'ansible_shell_type' from source: unknown 16142 1727204125.26225: variable 'ansible_shell_executable' from source: unknown 16142 1727204125.26227: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204125.26229: variable 'ansible_pipelining' from source: unknown 16142 1727204125.26232: variable 'ansible_timeout' from source: unknown 16142 1727204125.26239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204125.26344: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204125.26352: variable 
'omit' from source: magic vars 16142 1727204125.26359: starting attempt loop 16142 1727204125.26361: running the handler 16142 1727204125.26373: handler run complete 16142 1727204125.26382: attempt loop complete, returning result 16142 1727204125.26385: _execute() done 16142 1727204125.26387: dumping result to json 16142 1727204125.26389: done dumping result, returning 16142 1727204125.26396: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-fddd-f6c7-000000000674] 16142 1727204125.26402: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000674 16142 1727204125.26490: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000674 16142 1727204125.26493: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 16142 1727204125.26554: no more pending results, returning what we have 16142 1727204125.26557: results queue empty 16142 1727204125.26558: checking for any_errors_fatal 16142 1727204125.26559: done checking for any_errors_fatal 16142 1727204125.26560: checking for max_fail_percentage 16142 1727204125.26561: done checking for max_fail_percentage 16142 1727204125.26562: checking to see if all hosts have failed and the running result is not ok 16142 1727204125.26563: done checking to see if all hosts have failed 16142 1727204125.26565: getting the remaining hosts for this loop 16142 1727204125.26566: done getting the remaining hosts for this loop 16142 1727204125.26570: getting the next task for host managed-node2 16142 1727204125.26582: done getting next task for host managed-node2 16142 1727204125.26588: ^ task is: TASK: Stat profile file 16142 1727204125.26593: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204125.26600: getting variables 16142 1727204125.26604: in VariableManager get_vars() 16142 1727204125.26661: Calling all_inventory to load vars for managed-node2 16142 1727204125.26666: Calling groups_inventory to load vars for managed-node2 16142 1727204125.26668: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204125.26678: Calling all_plugins_play to load vars for managed-node2 16142 1727204125.26680: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204125.26682: Calling groups_plugins_play to load vars for managed-node2 16142 1727204125.27732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204125.28863: done with get_vars() 16142 1727204125.28883: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:55:25 -0400 (0:00:00.044) 0:00:24.466 ***** 16142 1727204125.28953: entering _queue_task() for managed-node2/stat 16142 1727204125.29200: worker is 1 (out of 1 available) 16142 1727204125.29213: exiting _queue_task() for managed-node2/stat 16142 1727204125.29226: done queuing things up, now waiting for results queue to drain 16142 1727204125.29228: waiting for pending results... 16142 1727204125.29423: running TaskExecutor() for managed-node2/TASK: Stat profile file 16142 1727204125.29503: in run() - task 0affcd87-79f5-fddd-f6c7-000000000675 16142 1727204125.29515: variable 'ansible_search_path' from source: unknown 16142 1727204125.29518: variable 'ansible_search_path' from source: unknown 16142 1727204125.29551: calling self._execute() 16142 1727204125.29623: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204125.29627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204125.29637: variable 'omit' from source: magic vars 16142 1727204125.29988: variable 'ansible_distribution_major_version' from source: facts 16142 1727204125.30015: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204125.30028: variable 'omit' from source: magic vars 16142 1727204125.30081: variable 'omit' from source: magic vars 16142 1727204125.30194: variable 'profile' from source: include params 16142 1727204125.30198: variable 'item' from source: include params 16142 1727204125.30259: variable 'item' from source: include params 16142 1727204125.30282: variable 'omit' from source: magic vars 16142 1727204125.30309: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204125.30339: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204125.30377: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204125.30381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204125.30408: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204125.30438: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204125.30441: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204125.30460: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204125.30563: Set connection var ansible_timeout to 10 16142 1727204125.30568: Set connection var ansible_connection to ssh 16142 1727204125.30571: Set connection var ansible_shell_type to sh 16142 1727204125.30573: Set connection var ansible_shell_executable to /bin/sh 16142 1727204125.30581: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204125.30586: Set connection var ansible_pipelining to False 16142 1727204125.30639: variable 'ansible_shell_executable' from source: unknown 16142 1727204125.30642: variable 'ansible_connection' from source: unknown 16142 1727204125.30645: variable 'ansible_module_compression' from source: unknown 16142 1727204125.30647: variable 'ansible_shell_type' from source: unknown 16142 1727204125.30649: variable 'ansible_shell_executable' from source: unknown 16142 1727204125.30651: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204125.30653: variable 'ansible_pipelining' from source: unknown 16142 1727204125.30655: variable 'ansible_timeout' from source: unknown 16142 1727204125.30659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204125.30802: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204125.30812: variable 'omit' from source: magic vars 16142 1727204125.30818: starting attempt loop 16142 1727204125.30822: running the handler 16142 1727204125.30837: _low_level_execute_command(): starting 16142 1727204125.30839: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204125.31453: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204125.31467: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204125.31478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204125.31487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204125.31550: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204125.31554: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204125.31608: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204125.31612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204125.31732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204125.33347: stdout chunk (state=3): >>>/root <<< 16142 1727204125.33546: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 16142 1727204125.33554: stdout chunk (state=3): >>><<< 16142 1727204125.33560: stderr chunk (state=3): >>><<< 16142 1727204125.33671: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204125.33676: _low_level_execute_command(): starting 16142 1727204125.33680: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204125.3359065-18053-220993218072361 `" && echo ansible-tmp-1727204125.3359065-18053-220993218072361="` echo /root/.ansible/tmp/ansible-tmp-1727204125.3359065-18053-220993218072361 `" ) && sleep 0' 16142 1727204125.34357: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204125.34374: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204125.34396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204125.34418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204125.34489: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204125.34502: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204125.34518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204125.34540: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204125.34553: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204125.34570: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204125.34589: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204125.34608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204125.34625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204125.34641: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204125.34653: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204125.34671: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204125.34759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204125.34787: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204125.34813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204125.34891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204125.36760: stdout chunk (state=3): >>>ansible-tmp-1727204125.3359065-18053-220993218072361=/root/.ansible/tmp/ansible-tmp-1727204125.3359065-18053-220993218072361 <<< 16142 1727204125.36988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204125.36991: stdout chunk (state=3): >>><<< 16142 1727204125.36994: stderr chunk (state=3): >>><<< 16142 1727204125.37273: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204125.3359065-18053-220993218072361=/root/.ansible/tmp/ansible-tmp-1727204125.3359065-18053-220993218072361 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204125.37277: variable 'ansible_module_compression' from source: unknown 16142 1727204125.37279: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 16142 1727204125.37282: variable 'ansible_facts' from source: unknown 16142 1727204125.37284: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204125.3359065-18053-220993218072361/AnsiballZ_stat.py 16142 1727204125.37453: Sending initial data 16142 1727204125.37456: Sent initial data (153 bytes) 16142 1727204125.38550: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204125.38568: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204125.38589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204125.38608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204125.38651: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204125.38663: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204125.38680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204125.38703: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204125.38717: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204125.38728: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204125.38741: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204125.38756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204125.38774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204125.38787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204125.38799: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204125.38819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204125.38898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204125.38915: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204125.38943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204125.39062: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204125.40757: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204125.40833: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204125.40845: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmp0n6gcrjj /root/.ansible/tmp/ansible-tmp-1727204125.3359065-18053-220993218072361/AnsiballZ_stat.py <<< 16142 1727204125.40871: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204125.41993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204125.42183: stderr chunk (state=3): >>><<< 16142 1727204125.42187: stdout chunk (state=3): >>><<< 16142 1727204125.42200: done transferring module to remote 16142 1727204125.42204: _low_level_execute_command(): starting 16142 1727204125.42207: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204125.3359065-18053-220993218072361/ /root/.ansible/tmp/ansible-tmp-1727204125.3359065-18053-220993218072361/AnsiballZ_stat.py && sleep 0' 16142 1727204125.42805: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204125.42821: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204125.42843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204125.42860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204125.42923: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204125.42938: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204125.42955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204125.42985: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204125.42998: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204125.43008: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204125.43036: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204125.43052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204125.43071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204125.43092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204125.43104: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204125.43119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204125.43257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204125.43291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204125.43306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204125.43404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204125.45098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204125.45169: stderr chunk (state=3): >>><<< 16142 1727204125.45184: stdout chunk (state=3): >>><<< 16142 1727204125.45206: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204125.45209: _low_level_execute_command(): starting 16142 1727204125.45215: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204125.3359065-18053-220993218072361/AnsiballZ_stat.py && sleep 0' 16142 1727204125.45888: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204125.45898: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204125.45908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204125.45923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204125.45974: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204125.45982: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204125.45992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204125.46004: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204125.46017: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204125.46020: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204125.46025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204125.46034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204125.46049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204125.46056: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204125.46062: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204125.46078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204125.46151: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204125.46168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204125.46182: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204125.46253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204125.59281: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 16142 1727204125.60358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204125.60362: stdout chunk (state=3): >>><<< 16142 1727204125.60369: stderr chunk (state=3): >>><<< 16142 1727204125.60415: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 16142 1727204125.60443: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204125.3359065-18053-220993218072361/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204125.60454: _low_level_execute_command(): starting 16142 1727204125.60459: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204125.3359065-18053-220993218072361/ > /dev/null 2>&1 && sleep 0' 16142 1727204125.61127: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204125.61138: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204125.61148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204125.61162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204125.61203: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204125.61210: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204125.61222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204125.61238: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204125.61241: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204125.61249: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204125.61257: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204125.61267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204125.61282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204125.61289: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204125.61294: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204125.61303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204125.61375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204125.61395: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204125.61406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204125.61482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204125.63299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204125.63397: stderr chunk (state=3): >>><<< 16142 1727204125.63409: stdout chunk (state=3): >>><<< 16142 1727204125.63678: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204125.63681: handler run complete 16142 1727204125.63684: attempt loop complete, returning result 16142 1727204125.63686: _execute() done 16142 1727204125.63688: dumping result to json 16142 1727204125.63691: done dumping result, returning 16142 1727204125.63693: done running TaskExecutor() for managed-node2/TASK: Stat profile file [0affcd87-79f5-fddd-f6c7-000000000675] 16142 1727204125.63695: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000675 16142 1727204125.63779: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000675 16142 1727204125.63783: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 16142 1727204125.63851: no more pending results, returning what we have 16142 
1727204125.63856: results queue empty 16142 1727204125.63857: checking for any_errors_fatal 16142 1727204125.63866: done checking for any_errors_fatal 16142 1727204125.63867: checking for max_fail_percentage 16142 1727204125.63870: done checking for max_fail_percentage 16142 1727204125.63870: checking to see if all hosts have failed and the running result is not ok 16142 1727204125.63871: done checking to see if all hosts have failed 16142 1727204125.63872: getting the remaining hosts for this loop 16142 1727204125.63873: done getting the remaining hosts for this loop 16142 1727204125.63877: getting the next task for host managed-node2 16142 1727204125.63884: done getting next task for host managed-node2 16142 1727204125.63887: ^ task is: TASK: Set NM profile exist flag based on the profile files 16142 1727204125.63893: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204125.63897: getting variables 16142 1727204125.63899: in VariableManager get_vars() 16142 1727204125.63962: Calling all_inventory to load vars for managed-node2 16142 1727204125.63967: Calling groups_inventory to load vars for managed-node2 16142 1727204125.63970: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204125.63981: Calling all_plugins_play to load vars for managed-node2 16142 1727204125.63984: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204125.63987: Calling groups_plugins_play to load vars for managed-node2 16142 1727204125.65690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204125.67578: done with get_vars() 16142 1727204125.67608: done getting variables 16142 1727204125.67688: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:55:25 -0400 (0:00:00.387) 0:00:24.854 ***** 16142 1727204125.67721: entering _queue_task() for managed-node2/set_fact 16142 1727204125.68090: worker is 1 (out of 1 available) 16142 1727204125.68107: exiting _queue_task() for managed-node2/set_fact 16142 1727204125.68120: done queuing things up, now waiting for results queue to drain 16142 1727204125.68121: waiting for pending results... 
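For reference, the "Stat profile file" result above (changed=false, stat.exists=false for /etc/sysconfig/network-scripts/ifcfg-bond0.1) corresponds to a stat task roughly like the sketch below. This is a hedged reconstruction from the module_args echoed in the log; the real task in get_profile_stat.yml may differ, and the "ifcfg-{{ profile }}" path expression is an assumption based on the bond0.1 value seen in the log.

    # Hedged reconstruction of the "Stat profile file" task; the module options
    # come from the module_args in the log, the path expression is assumed.
    - name: Stat profile file
      stat:
        path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat   # later log lines evaluate profile_stat.stat.exists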
16142 1727204125.68425: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 16142 1727204125.68562: in run() - task 0affcd87-79f5-fddd-f6c7-000000000676 16142 1727204125.68586: variable 'ansible_search_path' from source: unknown 16142 1727204125.68593: variable 'ansible_search_path' from source: unknown 16142 1727204125.68636: calling self._execute() 16142 1727204125.68744: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204125.68762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204125.68785: variable 'omit' from source: magic vars 16142 1727204125.69217: variable 'ansible_distribution_major_version' from source: facts 16142 1727204125.69238: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204125.69429: variable 'profile_stat' from source: set_fact 16142 1727204125.69452: Evaluated conditional (profile_stat.stat.exists): False 16142 1727204125.69459: when evaluation is False, skipping this task 16142 1727204125.69475: _execute() done 16142 1727204125.69484: dumping result to json 16142 1727204125.69491: done dumping result, returning 16142 1727204125.69501: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-fddd-f6c7-000000000676] 16142 1727204125.69512: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000676 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16142 1727204125.69691: no more pending results, returning what we have 16142 1727204125.69695: results queue empty 16142 1727204125.69696: checking for any_errors_fatal 16142 1727204125.69703: done checking for any_errors_fatal 16142 1727204125.69703: checking for max_fail_percentage 16142 1727204125.69705: done checking for max_fail_percentage 16142 1727204125.69706: checking to see if all hosts have failed and the running result is not ok 16142 1727204125.69707: done checking to see if all hosts have failed 16142 1727204125.69708: getting the remaining hosts for this loop 16142 1727204125.69709: done getting the remaining hosts for this loop 16142 1727204125.69716: getting the next task for host managed-node2 16142 1727204125.69723: done getting next task for host managed-node2 16142 1727204125.69726: ^ task is: TASK: Get NM profile info 16142 1727204125.69731: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204125.69739: getting variables 16142 1727204125.69742: in VariableManager get_vars() 16142 1727204125.69803: Calling all_inventory to load vars for managed-node2 16142 1727204125.69807: Calling groups_inventory to load vars for managed-node2 16142 1727204125.69809: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204125.69822: Calling all_plugins_play to load vars for managed-node2 16142 1727204125.69824: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204125.69827: Calling groups_plugins_play to load vars for managed-node2 16142 1727204125.70873: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000676 16142 1727204125.70876: WORKER PROCESS EXITING 16142 1727204125.71949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204125.74352: done with get_vars() 16142 1727204125.74390: done getting variables 16142 1727204125.74470: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:55:25 -0400 (0:00:00.067) 0:00:24.921 ***** 16142 1727204125.74503: entering _queue_task() for managed-node2/shell 16142 1727204125.74830: worker is 1 (out of 1 available) 16142 1727204125.74842: exiting _queue_task() for managed-node2/shell 16142 1727204125.74855: done queuing things up, now waiting for results queue to drain 16142 1727204125.74857: waiting for pending results... 
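The skip above ("Conditional result was False") is the guard on profile_stat.stat.exists doing its job: the ifcfg file does not exist, so the flag is never set from it. A minimal sketch of that skipped set_fact task, assuming the flag name matches the lsr_net_profile_exists fact set later by the nmcli-based task:

    # Sketch of the skipped task at get_profile_stat.yml:17; the fact name is an
    # assumption borrowed from the nmcli-based task further down in this log.
    - name: Set NM profile exist flag based on the profile files
      set_fact:
        lsr_net_profile_exists: true
      when: profile_stat.stat.exists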
16142 1727204125.75147: running TaskExecutor() for managed-node2/TASK: Get NM profile info 16142 1727204125.75251: in run() - task 0affcd87-79f5-fddd-f6c7-000000000677 16142 1727204125.75266: variable 'ansible_search_path' from source: unknown 16142 1727204125.75270: variable 'ansible_search_path' from source: unknown 16142 1727204125.75309: calling self._execute() 16142 1727204125.75413: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204125.75420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204125.75429: variable 'omit' from source: magic vars 16142 1727204125.75815: variable 'ansible_distribution_major_version' from source: facts 16142 1727204125.75828: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204125.75838: variable 'omit' from source: magic vars 16142 1727204125.75887: variable 'omit' from source: magic vars 16142 1727204125.75991: variable 'profile' from source: include params 16142 1727204125.76045: variable 'item' from source: include params 16142 1727204125.76071: variable 'item' from source: include params 16142 1727204125.76097: variable 'omit' from source: magic vars 16142 1727204125.76155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204125.76177: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204125.76206: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204125.76221: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204125.76232: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204125.76548: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204125.76552: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204125.76556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204125.76559: Set connection var ansible_timeout to 10 16142 1727204125.76562: Set connection var ansible_connection to ssh 16142 1727204125.76570: Set connection var ansible_shell_type to sh 16142 1727204125.76573: Set connection var ansible_shell_executable to /bin/sh 16142 1727204125.76575: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204125.76577: Set connection var ansible_pipelining to False 16142 1727204125.76578: variable 'ansible_shell_executable' from source: unknown 16142 1727204125.76580: variable 'ansible_connection' from source: unknown 16142 1727204125.76582: variable 'ansible_module_compression' from source: unknown 16142 1727204125.76584: variable 'ansible_shell_type' from source: unknown 16142 1727204125.76585: variable 'ansible_shell_executable' from source: unknown 16142 1727204125.76587: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204125.76589: variable 'ansible_pipelining' from source: unknown 16142 1727204125.76591: variable 'ansible_timeout' from source: unknown 16142 1727204125.76592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204125.77119: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204125.77130: variable 'omit' from source: magic vars 16142 1727204125.77137: starting attempt loop 16142 1727204125.77140: running the handler 16142 1727204125.77148: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204125.77169: _low_level_execute_command(): starting 16142 1727204125.77176: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204125.78130: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204125.78141: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204125.78151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204125.78172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204125.78209: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204125.78222: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204125.78225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204125.78238: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204125.78246: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204125.78252: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204125.78260: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204125.78271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204125.78287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204125.78294: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204125.78301: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204125.78311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204125.78383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204125.78402: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204125.78413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204125.78483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204125.80089: stdout chunk (state=3): >>>/root <<< 16142 1727204125.80214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204125.80265: stderr chunk (state=3): >>><<< 16142 1727204125.80269: stdout chunk (state=3): >>><<< 16142 1727204125.80286: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204125.80300: _low_level_execute_command(): starting 16142 1727204125.80315: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204125.8028762-18075-234470272494404 `" && echo ansible-tmp-1727204125.8028762-18075-234470272494404="` echo /root/.ansible/tmp/ansible-tmp-1727204125.8028762-18075-234470272494404 `" ) && sleep 0' 16142 1727204125.81070: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204125.81082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204125.81126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204125.81132: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204125.81151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204125.81158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204125.81241: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204125.81257: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204125.81260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204125.81341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204125.83203: stdout chunk (state=3): >>>ansible-tmp-1727204125.8028762-18075-234470272494404=/root/.ansible/tmp/ansible-tmp-1727204125.8028762-18075-234470272494404 <<< 16142 1727204125.83314: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204125.83549: stderr chunk (state=3): >>><<< 16142 1727204125.83553: stdout chunk (state=3): >>><<< 16142 1727204125.83591: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204125.8028762-18075-234470272494404=/root/.ansible/tmp/ansible-tmp-1727204125.8028762-18075-234470272494404 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204125.83634: variable 'ansible_module_compression' from source: unknown 16142 1727204125.83717: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 16142 1727204125.83768: variable 'ansible_facts' from source: unknown 16142 1727204125.83875: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204125.8028762-18075-234470272494404/AnsiballZ_command.py 16142 1727204125.84489: Sending initial data 16142 1727204125.84492: Sent initial data (156 bytes) 16142 1727204125.85881: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204125.85888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204125.85927: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204125.85935: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204125.85949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204125.85955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204125.86028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204125.86041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204125.86046: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204125.86461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204125.87808: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: 
Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204125.87833: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204125.87879: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpljg89g3q /root/.ansible/tmp/ansible-tmp-1727204125.8028762-18075-234470272494404/AnsiballZ_command.py <<< 16142 1727204125.87909: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204125.89024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204125.89216: stderr chunk (state=3): >>><<< 16142 1727204125.89220: stdout chunk (state=3): >>><<< 16142 1727204125.89222: done transferring module to remote 16142 1727204125.89225: _low_level_execute_command(): starting 16142 1727204125.89227: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204125.8028762-18075-234470272494404/ /root/.ansible/tmp/ansible-tmp-1727204125.8028762-18075-234470272494404/AnsiballZ_command.py && sleep 0' 16142 1727204125.90217: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204125.90221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204125.90261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204125.90271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 16142 1727204125.90274: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204125.90276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204125.90340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204125.90344: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204125.90399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204125.92175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204125.92179: stdout chunk (state=3): >>><<< 16142 1727204125.92186: stderr chunk (state=3): >>><<< 16142 1727204125.92205: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204125.92208: _low_level_execute_command(): starting 16142 1727204125.92211: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204125.8028762-18075-234470272494404/AnsiballZ_command.py && sleep 0' 16142 1727204125.93908: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204125.93917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204125.93940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204125.93985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204125.94023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204125.94070: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204125.94076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204125.94091: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204125.94152: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204125.94160: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204125.94175: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204125.94185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204125.94227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204125.94230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204125.94232: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204125.94234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204125.94372: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204125.94393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204125.94468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204126.09806: stdout chunk (state=3): >>> 
{"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-24 14:55:26.074962", "end": "2024-09-24 14:55:26.097436", "delta": "0:00:00.022474", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 16142 1727204126.11081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 16142 1727204126.11085: stdout chunk (state=3): >>><<< 16142 1727204126.11091: stderr chunk (state=3): >>><<< 16142 1727204126.11112: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-24 14:55:26.074962", "end": "2024-09-24 14:55:26.097436", "delta": "0:00:00.022474", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
16142 1727204126.11154: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204125.8028762-18075-234470272494404/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204126.11163: _low_level_execute_command(): starting 16142 1727204126.11173: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204125.8028762-18075-234470272494404/ > /dev/null 2>&1 && sleep 0' 16142 1727204126.13368: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204126.13377: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204126.14087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204126.14101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204126.14152: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204126.14159: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204126.14171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204126.14185: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204126.14192: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204126.14199: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204126.14206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204126.14215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204126.14226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204126.14233: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204126.14244: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204126.14254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204126.14330: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204126.14354: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204126.14365: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204126.14434: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204126.16289: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204126.16308: stderr chunk (state=3): >>><<< 16142 1727204126.16312: stdout chunk (state=3): >>><<< 16142 1727204126.16330: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204126.16340: handler run complete 16142 1727204126.16366: Evaluated conditional (False): False 16142 1727204126.16377: attempt loop complete, returning result 16142 1727204126.16380: _execute() done 16142 1727204126.16382: dumping result to json 16142 1727204126.16387: done dumping result, returning 16142 1727204126.16396: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [0affcd87-79f5-fddd-f6c7-000000000677] 16142 1727204126.16402: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000677 16142 1727204126.16513: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000677 16142 1727204126.16515: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.022474", "end": "2024-09-24 14:55:26.097436", "rc": 0, "start": "2024-09-24 14:55:26.074962" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 16142 1727204126.16599: no more pending results, returning what we have 16142 1727204126.16603: results queue empty 16142 1727204126.16604: checking for any_errors_fatal 16142 1727204126.16610: done checking for any_errors_fatal 16142 1727204126.16610: checking for max_fail_percentage 16142 1727204126.16613: done checking for max_fail_percentage 16142 1727204126.16614: checking to see if all hosts have failed and the running result is not ok 16142 1727204126.16615: done checking to see if all hosts have failed 16142 1727204126.16615: getting the remaining hosts for this loop 16142 1727204126.16617: done getting the remaining hosts for this loop 16142 1727204126.16620: getting the next task for host managed-node2 16142 1727204126.16626: done getting next task for host managed-node2 16142 1727204126.16628: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 16142 1727204126.16633: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204126.16636: getting variables 16142 1727204126.16638: in VariableManager get_vars() 16142 1727204126.16692: Calling all_inventory to load vars for managed-node2 16142 1727204126.16695: Calling groups_inventory to load vars for managed-node2 16142 1727204126.16697: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204126.16707: Calling all_plugins_play to load vars for managed-node2 16142 1727204126.16709: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204126.16712: Calling groups_plugins_play to load vars for managed-node2 16142 1727204126.20400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204126.25344: done with get_vars() 16142 1727204126.25495: done getting variables 16142 1727204126.25560: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:55:26 -0400 (0:00:00.520) 0:00:25.442 ***** 16142 1727204126.26558: entering _queue_task() for managed-node2/set_fact 16142 1727204126.27576: worker is 1 (out of 1 available) 16142 1727204126.27591: exiting _queue_task() for managed-node2/set_fact 16142 1727204126.27604: done queuing things up, now waiting for results queue to drain 16142 1727204126.27606: waiting for pending results... 
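The "Get NM profile info" task above ran nmcli on the target and found the bond0.1 profile stored as an NM keyfile under /etc/NetworkManager/system-connections/. A hedged sketch of that task follows, with the command copied from the module result and the register name inferred from the nm_profile_exists.rc == 0 conditional evaluated by the task queued next; ignore_errors is an assumption to cover hosts where the grep matches nothing.

    # Sketch of "Get NM profile info" (get_profile_stat.yml:25); the command is
    # taken verbatim from the result, register/ignore_errors are inferred.
    - name: Get NM profile info
      shell: nmcli -f NAME,FILENAME connection show |grep {{ profile }} | grep /etc
      register: nm_profile_exists
      ignore_errors: true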
16142 1727204126.28356: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 16142 1727204126.28669: in run() - task 0affcd87-79f5-fddd-f6c7-000000000678 16142 1727204126.28684: variable 'ansible_search_path' from source: unknown 16142 1727204126.28687: variable 'ansible_search_path' from source: unknown 16142 1727204126.28725: calling self._execute() 16142 1727204126.29123: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204126.29128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204126.29141: variable 'omit' from source: magic vars 16142 1727204126.29521: variable 'ansible_distribution_major_version' from source: facts 16142 1727204126.29534: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204126.29675: variable 'nm_profile_exists' from source: set_fact 16142 1727204126.29689: Evaluated conditional (nm_profile_exists.rc == 0): True 16142 1727204126.29697: variable 'omit' from source: magic vars 16142 1727204126.29746: variable 'omit' from source: magic vars 16142 1727204126.30386: variable 'omit' from source: magic vars 16142 1727204126.30429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204126.30467: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204126.30490: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204126.30508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204126.30520: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204126.30557: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204126.30560: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204126.30563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204126.30877: Set connection var ansible_timeout to 10 16142 1727204126.30881: Set connection var ansible_connection to ssh 16142 1727204126.30884: Set connection var ansible_shell_type to sh 16142 1727204126.30894: Set connection var ansible_shell_executable to /bin/sh 16142 1727204126.30897: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204126.30899: Set connection var ansible_pipelining to False 16142 1727204126.30927: variable 'ansible_shell_executable' from source: unknown 16142 1727204126.30931: variable 'ansible_connection' from source: unknown 16142 1727204126.30933: variable 'ansible_module_compression' from source: unknown 16142 1727204126.30938: variable 'ansible_shell_type' from source: unknown 16142 1727204126.30940: variable 'ansible_shell_executable' from source: unknown 16142 1727204126.30943: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204126.30948: variable 'ansible_pipelining' from source: unknown 16142 1727204126.30950: variable 'ansible_timeout' from source: unknown 16142 1727204126.30955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204126.31509: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204126.31520: variable 'omit' from source: magic vars 16142 1727204126.31526: starting attempt loop 16142 1727204126.31529: running the handler 16142 1727204126.31548: handler run complete 16142 1727204126.31557: attempt loop complete, returning result 16142 1727204126.31561: _execute() done 16142 1727204126.31563: dumping result to json 16142 1727204126.31568: done dumping result, returning 16142 1727204126.31985: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-fddd-f6c7-000000000678] 16142 1727204126.31989: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000678 16142 1727204126.32076: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000678 16142 1727204126.32079: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 16142 1727204126.32141: no more pending results, returning what we have 16142 1727204126.32145: results queue empty 16142 1727204126.32145: checking for any_errors_fatal 16142 1727204126.32154: done checking for any_errors_fatal 16142 1727204126.32154: checking for max_fail_percentage 16142 1727204126.32157: done checking for max_fail_percentage 16142 1727204126.32158: checking to see if all hosts have failed and the running result is not ok 16142 1727204126.32158: done checking to see if all hosts have failed 16142 1727204126.32159: getting the remaining hosts for this loop 16142 1727204126.32160: done getting the remaining hosts for this loop 16142 1727204126.32166: getting the next task for host managed-node2 16142 1727204126.32174: done getting next task for host managed-node2 16142 1727204126.32176: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 16142 1727204126.32181: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204126.32185: getting variables 16142 1727204126.32187: in VariableManager get_vars() 16142 1727204126.32242: Calling all_inventory to load vars for managed-node2 16142 1727204126.32245: Calling groups_inventory to load vars for managed-node2 16142 1727204126.32247: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204126.32256: Calling all_plugins_play to load vars for managed-node2 16142 1727204126.32258: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204126.32260: Calling groups_plugins_play to load vars for managed-node2 16142 1727204126.34672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204126.38359: done with get_vars() 16142 1727204126.38400: done getting variables 16142 1727204126.38589: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204126.38848: variable 'profile' from source: include params 16142 1727204126.38852: variable 'item' from source: include params 16142 1727204126.39038: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:55:26 -0400 (0:00:00.125) 0:00:25.567 ***** 16142 1727204126.39200: entering _queue_task() for managed-node2/command 16142 1727204126.39901: worker is 1 (out of 1 available) 16142 1727204126.39914: exiting _queue_task() for managed-node2/command 16142 1727204126.39926: done queuing things up, now waiting for results queue to drain 16142 1727204126.39927: waiting for pending results... 
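The ansible_facts block printed in the result above pins this set_fact task down fairly tightly; a sketch that assumes nothing beyond what the log shows:

    # Reconstruction of get_profile_stat.yml:35 from the facts and the
    # nm_profile_exists.rc == 0 conditional visible in the result above.
    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      set_fact:
        lsr_net_profile_exists: true
        lsr_net_profile_ansible_managed: true
        lsr_net_profile_fingerprint: true
      when: nm_profile_exists.rc == 0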
16142 1727204126.40336: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-bond0.1 16142 1727204126.40459: in run() - task 0affcd87-79f5-fddd-f6c7-00000000067a 16142 1727204126.40480: variable 'ansible_search_path' from source: unknown 16142 1727204126.40484: variable 'ansible_search_path' from source: unknown 16142 1727204126.40524: calling self._execute() 16142 1727204126.40629: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204126.40633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204126.40646: variable 'omit' from source: magic vars 16142 1727204126.41935: variable 'ansible_distribution_major_version' from source: facts 16142 1727204126.41939: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204126.41942: variable 'profile_stat' from source: set_fact 16142 1727204126.41944: Evaluated conditional (profile_stat.stat.exists): False 16142 1727204126.41947: when evaluation is False, skipping this task 16142 1727204126.41949: _execute() done 16142 1727204126.41951: dumping result to json 16142 1727204126.41954: done dumping result, returning 16142 1727204126.41956: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [0affcd87-79f5-fddd-f6c7-00000000067a] 16142 1727204126.41959: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000067a 16142 1727204126.42041: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000067a 16142 1727204126.42045: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16142 1727204126.42098: no more pending results, returning what we have 16142 1727204126.42102: results queue empty 16142 1727204126.42103: checking for any_errors_fatal 16142 1727204126.42107: done checking for any_errors_fatal 16142 1727204126.42108: checking for max_fail_percentage 16142 1727204126.42110: done checking for max_fail_percentage 16142 1727204126.42111: checking to see if all hosts have failed and the running result is not ok 16142 1727204126.42112: done checking to see if all hosts have failed 16142 1727204126.42113: getting the remaining hosts for this loop 16142 1727204126.42114: done getting the remaining hosts for this loop 16142 1727204126.42118: getting the next task for host managed-node2 16142 1727204126.42125: done getting next task for host managed-node2 16142 1727204126.42128: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 16142 1727204126.42132: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204126.42137: getting variables 16142 1727204126.42138: in VariableManager get_vars() 16142 1727204126.42197: Calling all_inventory to load vars for managed-node2 16142 1727204126.42200: Calling groups_inventory to load vars for managed-node2 16142 1727204126.42202: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204126.42213: Calling all_plugins_play to load vars for managed-node2 16142 1727204126.42216: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204126.42219: Calling groups_plugins_play to load vars for managed-node2 16142 1727204126.45236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204126.53493: done with get_vars() 16142 1727204126.53523: done getting variables 16142 1727204126.53575: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204126.53674: variable 'profile' from source: include params 16142 1727204126.53677: variable 'item' from source: include params 16142 1727204126.53736: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:55:26 -0400 (0:00:00.146) 0:00:25.714 ***** 16142 1727204126.53766: entering _queue_task() for managed-node2/set_fact 16142 1727204126.54562: worker is 1 (out of 1 available) 16142 1727204126.54578: exiting _queue_task() for managed-node2/set_fact 16142 1727204126.54597: done queuing things up, now waiting for results queue to drain 16142 1727204126.54599: waiting for pending results... 
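The remaining ifcfg-oriented checks (ansible_managed comment, fingerprint comment) are all skipped on this host because the bond0.1 profile exists only as a keyfile, so profile_stat.stat.exists stays false. Purely for illustration, a hypothetical shape for one of those skipped tasks; its real arguments never appear in this log because the task never executed:

    # Hypothetical sketch only; not taken from the log.
    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      command: grep 'Ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
      when: profile_stat.stat.exists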
16142 1727204126.55055: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 16142 1727204126.55141: in run() - task 0affcd87-79f5-fddd-f6c7-00000000067b 16142 1727204126.55151: variable 'ansible_search_path' from source: unknown 16142 1727204126.55155: variable 'ansible_search_path' from source: unknown 16142 1727204126.55190: calling self._execute() 16142 1727204126.55269: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204126.55276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204126.55285: variable 'omit' from source: magic vars 16142 1727204126.55570: variable 'ansible_distribution_major_version' from source: facts 16142 1727204126.55580: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204126.55669: variable 'profile_stat' from source: set_fact 16142 1727204126.55682: Evaluated conditional (profile_stat.stat.exists): False 16142 1727204126.55688: when evaluation is False, skipping this task 16142 1727204126.55691: _execute() done 16142 1727204126.55695: dumping result to json 16142 1727204126.55697: done dumping result, returning 16142 1727204126.55701: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [0affcd87-79f5-fddd-f6c7-00000000067b] 16142 1727204126.55706: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000067b 16142 1727204126.55801: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000067b 16142 1727204126.55804: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16142 1727204126.55873: no more pending results, returning what we have 16142 1727204126.55877: results queue empty 16142 1727204126.55878: checking for any_errors_fatal 16142 1727204126.55885: done checking for any_errors_fatal 16142 1727204126.55886: checking for max_fail_percentage 16142 1727204126.55889: done checking for max_fail_percentage 16142 1727204126.55889: checking to see if all hosts have failed and the running result is not ok 16142 1727204126.55890: done checking to see if all hosts have failed 16142 1727204126.55891: getting the remaining hosts for this loop 16142 1727204126.55892: done getting the remaining hosts for this loop 16142 1727204126.55896: getting the next task for host managed-node2 16142 1727204126.55902: done getting next task for host managed-node2 16142 1727204126.55905: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 16142 1727204126.55908: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204126.55911: getting variables 16142 1727204126.55920: in VariableManager get_vars() 16142 1727204126.55975: Calling all_inventory to load vars for managed-node2 16142 1727204126.55977: Calling groups_inventory to load vars for managed-node2 16142 1727204126.55979: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204126.55989: Calling all_plugins_play to load vars for managed-node2 16142 1727204126.55992: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204126.55994: Calling groups_plugins_play to load vars for managed-node2 16142 1727204126.57185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204126.59385: done with get_vars() 16142 1727204126.59413: done getting variables 16142 1727204126.59459: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204126.59551: variable 'profile' from source: include params 16142 1727204126.59554: variable 'item' from source: include params 16142 1727204126.59598: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:55:26 -0400 (0:00:00.058) 0:00:25.773 ***** 16142 1727204126.59626: entering _queue_task() for managed-node2/command 16142 1727204126.59874: worker is 1 (out of 1 available) 16142 1727204126.59888: exiting _queue_task() for managed-node2/command 16142 1727204126.59903: done queuing things up, now waiting for results queue to drain 16142 1727204126.59904: waiting for pending results... 
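The set_fact task skipped above and the command task skipped just below both come from get_profile_stat.yml and share the same gate: they only run when an earlier stat of the profile's ifcfg file found it, and here profile_stat.stat.exists is False for ifcfg-bond0.1. A minimal sketch of that pattern, assuming a conventional stat/register pairing (only the variable names and the when: condition are taken from this log; the ifcfg path and the fact value are illustrative):

    # Assumed shape of the gate seen in this log.
    - name: Get stat of the ifcfg file
      ansible.builtin.stat:
        path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"   # assumed path
      register: profile_stat

    - name: Verify the ansible_managed comment in ifcfg-{{ profile }}
      ansible.builtin.set_fact:
        lsr_net_profile_ansible_managed: true    # placeholder body, not from the log
      when: profile_stat.stat.exists              # False here, so the task is skipped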
16142 1727204126.60097: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-bond0.1 16142 1727204126.60189: in run() - task 0affcd87-79f5-fddd-f6c7-00000000067c 16142 1727204126.60199: variable 'ansible_search_path' from source: unknown 16142 1727204126.60202: variable 'ansible_search_path' from source: unknown 16142 1727204126.60234: calling self._execute() 16142 1727204126.60790: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204126.60795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204126.60798: variable 'omit' from source: magic vars 16142 1727204126.60801: variable 'ansible_distribution_major_version' from source: facts 16142 1727204126.60804: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204126.61060: variable 'profile_stat' from source: set_fact 16142 1727204126.61065: Evaluated conditional (profile_stat.stat.exists): False 16142 1727204126.61068: when evaluation is False, skipping this task 16142 1727204126.61070: _execute() done 16142 1727204126.61072: dumping result to json 16142 1727204126.61074: done dumping result, returning 16142 1727204126.61076: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-bond0.1 [0affcd87-79f5-fddd-f6c7-00000000067c] 16142 1727204126.61078: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000067c 16142 1727204126.61153: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000067c 16142 1727204126.61157: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16142 1727204126.61259: no more pending results, returning what we have 16142 1727204126.61267: results queue empty 16142 1727204126.61270: checking for any_errors_fatal 16142 1727204126.61275: done checking for any_errors_fatal 16142 1727204126.61275: checking for max_fail_percentage 16142 1727204126.61277: done checking for max_fail_percentage 16142 1727204126.61278: checking to see if all hosts have failed and the running result is not ok 16142 1727204126.61279: done checking to see if all hosts have failed 16142 1727204126.61280: getting the remaining hosts for this loop 16142 1727204126.61281: done getting the remaining hosts for this loop 16142 1727204126.61287: getting the next task for host managed-node2 16142 1727204126.61293: done getting next task for host managed-node2 16142 1727204126.61296: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 16142 1727204126.61300: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204126.61304: getting variables 16142 1727204126.61305: in VariableManager get_vars() 16142 1727204126.61352: Calling all_inventory to load vars for managed-node2 16142 1727204126.61355: Calling groups_inventory to load vars for managed-node2 16142 1727204126.61357: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204126.61370: Calling all_plugins_play to load vars for managed-node2 16142 1727204126.61372: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204126.61376: Calling groups_plugins_play to load vars for managed-node2 16142 1727204126.64033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204126.66016: done with get_vars() 16142 1727204126.66048: done getting variables 16142 1727204126.66135: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204126.66298: variable 'profile' from source: include params 16142 1727204126.66305: variable 'item' from source: include params 16142 1727204126.66393: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:55:26 -0400 (0:00:00.068) 0:00:25.841 ***** 16142 1727204126.66441: entering _queue_task() for managed-node2/set_fact 16142 1727204126.66866: worker is 1 (out of 1 available) 16142 1727204126.66880: exiting _queue_task() for managed-node2/set_fact 16142 1727204126.66892: done queuing things up, now waiting for results queue to drain 16142 1727204126.66893: waiting for pending results... 
16142 1727204126.67205: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-bond0.1 16142 1727204126.67347: in run() - task 0affcd87-79f5-fddd-f6c7-00000000067d 16142 1727204126.67373: variable 'ansible_search_path' from source: unknown 16142 1727204126.67410: variable 'ansible_search_path' from source: unknown 16142 1727204126.67423: calling self._execute() 16142 1727204126.67528: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204126.67547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204126.67568: variable 'omit' from source: magic vars 16142 1727204126.68066: variable 'ansible_distribution_major_version' from source: facts 16142 1727204126.68085: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204126.68259: variable 'profile_stat' from source: set_fact 16142 1727204126.68276: Evaluated conditional (profile_stat.stat.exists): False 16142 1727204126.68279: when evaluation is False, skipping this task 16142 1727204126.68284: _execute() done 16142 1727204126.68287: dumping result to json 16142 1727204126.68290: done dumping result, returning 16142 1727204126.68292: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [0affcd87-79f5-fddd-f6c7-00000000067d] 16142 1727204126.68295: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000067d skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16142 1727204126.68552: no more pending results, returning what we have 16142 1727204126.68557: results queue empty 16142 1727204126.68558: checking for any_errors_fatal 16142 1727204126.68572: done checking for any_errors_fatal 16142 1727204126.68575: checking for max_fail_percentage 16142 1727204126.68580: done checking for max_fail_percentage 16142 1727204126.68581: checking to see if all hosts have failed and the running result is not ok 16142 1727204126.68582: done checking to see if all hosts have failed 16142 1727204126.68583: getting the remaining hosts for this loop 16142 1727204126.68585: done getting the remaining hosts for this loop 16142 1727204126.68589: getting the next task for host managed-node2 16142 1727204126.68599: done getting next task for host managed-node2 16142 1727204126.68608: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 16142 1727204126.68614: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204126.68620: getting variables 16142 1727204126.68622: in VariableManager get_vars() 16142 1727204126.68715: Calling all_inventory to load vars for managed-node2 16142 1727204126.68721: Calling groups_inventory to load vars for managed-node2 16142 1727204126.68724: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204126.68730: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000067d 16142 1727204126.68736: WORKER PROCESS EXITING 16142 1727204126.68760: Calling all_plugins_play to load vars for managed-node2 16142 1727204126.68770: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204126.68775: Calling groups_plugins_play to load vars for managed-node2 16142 1727204126.70527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204126.71474: done with get_vars() 16142 1727204126.71492: done getting variables 16142 1727204126.71536: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204126.71627: variable 'profile' from source: include params 16142 1727204126.71630: variable 'item' from source: include params 16142 1727204126.71675: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:55:26 -0400 (0:00:00.052) 0:00:25.893 ***** 16142 1727204126.71701: entering _queue_task() for managed-node2/assert 16142 1727204126.71932: worker is 1 (out of 1 available) 16142 1727204126.71946: exiting _queue_task() for managed-node2/assert 16142 1727204126.71959: done queuing things up, now waiting for results queue to drain 16142 1727204126.71960: waiting for pending results... 
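In every task of this block, 'profile' and 'item' resolve from include params, which is why the templated names render as ifcfg-bond0.1 and 'bond0.1'. Purely for illustration, an include that would pass those values to the task file might look like the following (the caller and the loop contents are assumptions, not read from this log):

    # Hypothetical caller: passes profile per port so the included tasks
    # see 'profile' and 'item' as include params.
    - name: Assert each bond port profile is present
      ansible.builtin.include_tasks: assert_profile_present.yml
      vars:
        profile: "{{ item }}"
      loop:
        - bond0.1    # additional port profiles would be listed here as well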
16142 1727204126.72147: running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'bond0.1' 16142 1727204126.72224: in run() - task 0affcd87-79f5-fddd-f6c7-000000000364 16142 1727204126.72270: variable 'ansible_search_path' from source: unknown 16142 1727204126.72274: variable 'ansible_search_path' from source: unknown 16142 1727204126.72295: calling self._execute() 16142 1727204126.72376: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204126.72379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204126.72389: variable 'omit' from source: magic vars 16142 1727204126.72677: variable 'ansible_distribution_major_version' from source: facts 16142 1727204126.72688: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204126.72697: variable 'omit' from source: magic vars 16142 1727204126.72723: variable 'omit' from source: magic vars 16142 1727204126.72796: variable 'profile' from source: include params 16142 1727204126.72801: variable 'item' from source: include params 16142 1727204126.72851: variable 'item' from source: include params 16142 1727204126.72866: variable 'omit' from source: magic vars 16142 1727204126.72900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204126.72926: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204126.72947: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204126.72962: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204126.72975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204126.72999: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204126.73003: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204126.73005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204126.73081: Set connection var ansible_timeout to 10 16142 1727204126.73085: Set connection var ansible_connection to ssh 16142 1727204126.73089: Set connection var ansible_shell_type to sh 16142 1727204126.73096: Set connection var ansible_shell_executable to /bin/sh 16142 1727204126.73099: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204126.73105: Set connection var ansible_pipelining to False 16142 1727204126.73123: variable 'ansible_shell_executable' from source: unknown 16142 1727204126.73125: variable 'ansible_connection' from source: unknown 16142 1727204126.73129: variable 'ansible_module_compression' from source: unknown 16142 1727204126.73131: variable 'ansible_shell_type' from source: unknown 16142 1727204126.73135: variable 'ansible_shell_executable' from source: unknown 16142 1727204126.73138: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204126.73143: variable 'ansible_pipelining' from source: unknown 16142 1727204126.73145: variable 'ansible_timeout' from source: unknown 16142 1727204126.73148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204126.73254: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204126.73263: variable 'omit' from source: magic vars 16142 1727204126.73270: starting attempt loop 16142 1727204126.73273: running the handler 16142 1727204126.73351: variable 'lsr_net_profile_exists' from source: set_fact 16142 1727204126.73356: Evaluated conditional (lsr_net_profile_exists): True 16142 1727204126.73362: handler run complete 16142 1727204126.73374: attempt loop complete, returning result 16142 1727204126.73377: _execute() done 16142 1727204126.73380: dumping result to json 16142 1727204126.73384: done dumping result, returning 16142 1727204126.73391: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'bond0.1' [0affcd87-79f5-fddd-f6c7-000000000364] 16142 1727204126.73395: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000364 16142 1727204126.73484: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000364 16142 1727204126.73487: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 16142 1727204126.73547: no more pending results, returning what we have 16142 1727204126.73551: results queue empty 16142 1727204126.73552: checking for any_errors_fatal 16142 1727204126.73558: done checking for any_errors_fatal 16142 1727204126.73558: checking for max_fail_percentage 16142 1727204126.73560: done checking for max_fail_percentage 16142 1727204126.73561: checking to see if all hosts have failed and the running result is not ok 16142 1727204126.73561: done checking to see if all hosts have failed 16142 1727204126.73562: getting the remaining hosts for this loop 16142 1727204126.73566: done getting the remaining hosts for this loop 16142 1727204126.73570: getting the next task for host managed-node2 16142 1727204126.73580: done getting next task for host managed-node2 16142 1727204126.73583: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 16142 1727204126.73585: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204126.73591: getting variables 16142 1727204126.73592: in VariableManager get_vars() 16142 1727204126.73650: Calling all_inventory to load vars for managed-node2 16142 1727204126.73653: Calling groups_inventory to load vars for managed-node2 16142 1727204126.73655: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204126.73666: Calling all_plugins_play to load vars for managed-node2 16142 1727204126.73669: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204126.73671: Calling groups_plugins_play to load vars for managed-node2 16142 1727204126.75326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204126.76814: done with get_vars() 16142 1727204126.76844: done getting variables 16142 1727204126.76895: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204126.76988: variable 'profile' from source: include params 16142 1727204126.76991: variable 'item' from source: include params 16142 1727204126.77032: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:55:26 -0400 (0:00:00.053) 0:00:25.947 ***** 16142 1727204126.77072: entering _queue_task() for managed-node2/assert 16142 1727204126.77392: worker is 1 (out of 1 available) 16142 1727204126.77411: exiting _queue_task() for managed-node2/assert 16142 1727204126.77422: done queuing things up, now waiting for results queue to drain 16142 1727204126.77423: waiting for pending results... 
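The assert at assert_profile_present.yml:5 passes because lsr_net_profile_exists evaluates true; the next two tasks in the same file (lines 10 and 15, traced below) check lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint the same way. A sketch consistent with the task names and evaluated conditionals recorded here (any fail_msg text would be an addition, so none is shown):

    - name: Assert that the profile is present - '{{ profile }}'
      ansible.builtin.assert:
        that:
          - lsr_net_profile_exists

    - name: Assert that the ansible managed comment is present in '{{ profile }}'
      ansible.builtin.assert:
        that:
          - lsr_net_profile_ansible_managed

    - name: Assert that the fingerprint comment is present in {{ profile }}
      ansible.builtin.assert:
        that:
          - lsr_net_profile_fingerprint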
16142 1727204126.77760: running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'bond0.1' 16142 1727204126.77832: in run() - task 0affcd87-79f5-fddd-f6c7-000000000365 16142 1727204126.77846: variable 'ansible_search_path' from source: unknown 16142 1727204126.77851: variable 'ansible_search_path' from source: unknown 16142 1727204126.77890: calling self._execute() 16142 1727204126.78140: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204126.78147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204126.78173: variable 'omit' from source: magic vars 16142 1727204126.78626: variable 'ansible_distribution_major_version' from source: facts 16142 1727204126.78638: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204126.78644: variable 'omit' from source: magic vars 16142 1727204126.78691: variable 'omit' from source: magic vars 16142 1727204126.78780: variable 'profile' from source: include params 16142 1727204126.78784: variable 'item' from source: include params 16142 1727204126.78831: variable 'item' from source: include params 16142 1727204126.78861: variable 'omit' from source: magic vars 16142 1727204126.78893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204126.78920: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204126.78940: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204126.78953: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204126.78963: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204126.78995: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204126.78998: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204126.79001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204126.79068: Set connection var ansible_timeout to 10 16142 1727204126.79071: Set connection var ansible_connection to ssh 16142 1727204126.79075: Set connection var ansible_shell_type to sh 16142 1727204126.79080: Set connection var ansible_shell_executable to /bin/sh 16142 1727204126.79086: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204126.79091: Set connection var ansible_pipelining to False 16142 1727204126.79112: variable 'ansible_shell_executable' from source: unknown 16142 1727204126.79115: variable 'ansible_connection' from source: unknown 16142 1727204126.79118: variable 'ansible_module_compression' from source: unknown 16142 1727204126.79120: variable 'ansible_shell_type' from source: unknown 16142 1727204126.79122: variable 'ansible_shell_executable' from source: unknown 16142 1727204126.79124: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204126.79127: variable 'ansible_pipelining' from source: unknown 16142 1727204126.79129: variable 'ansible_timeout' from source: unknown 16142 1727204126.79132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204126.79239: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204126.79248: variable 'omit' from source: magic vars 16142 1727204126.79254: starting attempt loop 16142 1727204126.79257: running the handler 16142 1727204126.79335: variable 'lsr_net_profile_ansible_managed' from source: set_fact 16142 1727204126.79338: Evaluated conditional (lsr_net_profile_ansible_managed): True 16142 1727204126.79341: handler run complete 16142 1727204126.79353: attempt loop complete, returning result 16142 1727204126.79356: _execute() done 16142 1727204126.79360: dumping result to json 16142 1727204126.79362: done dumping result, returning 16142 1727204126.79370: done running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'bond0.1' [0affcd87-79f5-fddd-f6c7-000000000365] 16142 1727204126.79375: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000365 16142 1727204126.79472: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000365 16142 1727204126.79477: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 16142 1727204126.79523: no more pending results, returning what we have 16142 1727204126.79527: results queue empty 16142 1727204126.79527: checking for any_errors_fatal 16142 1727204126.79537: done checking for any_errors_fatal 16142 1727204126.79538: checking for max_fail_percentage 16142 1727204126.79540: done checking for max_fail_percentage 16142 1727204126.79541: checking to see if all hosts have failed and the running result is not ok 16142 1727204126.79541: done checking to see if all hosts have failed 16142 1727204126.79542: getting the remaining hosts for this loop 16142 1727204126.79543: done getting the remaining hosts for this loop 16142 1727204126.79547: getting the next task for host managed-node2 16142 1727204126.79553: done getting next task for host managed-node2 16142 1727204126.79555: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 16142 1727204126.79558: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204126.79562: getting variables 16142 1727204126.79565: in VariableManager get_vars() 16142 1727204126.79655: Calling all_inventory to load vars for managed-node2 16142 1727204126.79658: Calling groups_inventory to load vars for managed-node2 16142 1727204126.79660: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204126.79672: Calling all_plugins_play to load vars for managed-node2 16142 1727204126.79675: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204126.79677: Calling groups_plugins_play to load vars for managed-node2 16142 1727204126.80629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204126.81923: done with get_vars() 16142 1727204126.81940: done getting variables 16142 1727204126.81985: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204126.82087: variable 'profile' from source: include params 16142 1727204126.82090: variable 'item' from source: include params 16142 1727204126.82134: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:55:26 -0400 (0:00:00.051) 0:00:25.998 ***** 16142 1727204126.82162: entering _queue_task() for managed-node2/assert 16142 1727204126.82398: worker is 1 (out of 1 available) 16142 1727204126.82413: exiting _queue_task() for managed-node2/assert 16142 1727204126.82425: done queuing things up, now waiting for results queue to drain 16142 1727204126.82426: waiting for pending results... 
16142 1727204126.82613: running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in bond0.1 16142 1727204126.82716: in run() - task 0affcd87-79f5-fddd-f6c7-000000000366 16142 1727204126.82726: variable 'ansible_search_path' from source: unknown 16142 1727204126.82730: variable 'ansible_search_path' from source: unknown 16142 1727204126.82780: calling self._execute() 16142 1727204126.82897: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204126.82911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204126.82919: variable 'omit' from source: magic vars 16142 1727204126.83355: variable 'ansible_distribution_major_version' from source: facts 16142 1727204126.83367: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204126.83376: variable 'omit' from source: magic vars 16142 1727204126.83415: variable 'omit' from source: magic vars 16142 1727204126.83520: variable 'profile' from source: include params 16142 1727204126.83523: variable 'item' from source: include params 16142 1727204126.83578: variable 'item' from source: include params 16142 1727204126.83591: variable 'omit' from source: magic vars 16142 1727204126.83624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204126.83655: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204126.83677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204126.83692: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204126.83702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204126.83725: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204126.83729: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204126.83731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204126.83832: Set connection var ansible_timeout to 10 16142 1727204126.83838: Set connection var ansible_connection to ssh 16142 1727204126.83840: Set connection var ansible_shell_type to sh 16142 1727204126.83843: Set connection var ansible_shell_executable to /bin/sh 16142 1727204126.83849: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204126.83855: Set connection var ansible_pipelining to False 16142 1727204126.83875: variable 'ansible_shell_executable' from source: unknown 16142 1727204126.83882: variable 'ansible_connection' from source: unknown 16142 1727204126.83885: variable 'ansible_module_compression' from source: unknown 16142 1727204126.83892: variable 'ansible_shell_type' from source: unknown 16142 1727204126.83895: variable 'ansible_shell_executable' from source: unknown 16142 1727204126.83898: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204126.83901: variable 'ansible_pipelining' from source: unknown 16142 1727204126.83904: variable 'ansible_timeout' from source: unknown 16142 1727204126.83907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204126.84040: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204126.84050: variable 'omit' from source: magic vars 16142 1727204126.84056: starting attempt loop 16142 1727204126.84059: running the handler 16142 1727204126.84158: variable 'lsr_net_profile_fingerprint' from source: set_fact 16142 1727204126.84212: Evaluated conditional (lsr_net_profile_fingerprint): True 16142 1727204126.84216: handler run complete 16142 1727204126.84220: attempt loop complete, returning result 16142 1727204126.84222: _execute() done 16142 1727204126.84225: dumping result to json 16142 1727204126.84228: done dumping result, returning 16142 1727204126.84231: done running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in bond0.1 [0affcd87-79f5-fddd-f6c7-000000000366] 16142 1727204126.84244: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000366 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 16142 1727204126.84429: no more pending results, returning what we have 16142 1727204126.84433: results queue empty 16142 1727204126.84434: checking for any_errors_fatal 16142 1727204126.84442: done checking for any_errors_fatal 16142 1727204126.84443: checking for max_fail_percentage 16142 1727204126.84445: done checking for max_fail_percentage 16142 1727204126.84446: checking to see if all hosts have failed and the running result is not ok 16142 1727204126.84446: done checking to see if all hosts have failed 16142 1727204126.84447: getting the remaining hosts for this loop 16142 1727204126.84448: done getting the remaining hosts for this loop 16142 1727204126.84452: getting the next task for host managed-node2 16142 1727204126.84461: done getting next task for host managed-node2 16142 1727204126.84465: ^ task is: TASK: ** TEST check polling interval 16142 1727204126.84467: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204126.84472: getting variables 16142 1727204126.84474: in VariableManager get_vars() 16142 1727204126.84588: Calling all_inventory to load vars for managed-node2 16142 1727204126.84599: Calling groups_inventory to load vars for managed-node2 16142 1727204126.84612: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204126.84623: Calling all_plugins_play to load vars for managed-node2 16142 1727204126.84627: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204126.84631: Calling groups_plugins_play to load vars for managed-node2 16142 1727204126.85200: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000366 16142 1727204126.85204: WORKER PROCESS EXITING 16142 1727204126.86929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204126.90484: done with get_vars() 16142 1727204126.90514: done getting variables 16142 1727204126.90981: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check polling interval] ****************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:75 Tuesday 24 September 2024 14:55:26 -0400 (0:00:00.088) 0:00:26.086 ***** 16142 1727204126.91012: entering _queue_task() for managed-node2/command 16142 1727204126.91393: worker is 1 (out of 1 available) 16142 1727204126.91406: exiting _queue_task() for managed-node2/command 16142 1727204126.91419: done queuing things up, now waiting for results queue to drain 16142 1727204126.91421: waiting for pending results... 
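As the executor for this task spins up below, it resolves its connection settings: ansible_connection=ssh, ansible_shell_type=sh, ansible_shell_executable=/bin/sh, ansible_timeout=10, ansible_module_compression=ZIP_DEFLATED, ansible_pipelining=False. In this run those values come from defaults and play/host context, but for illustration the equivalent settings could be pinned per host in a YAML inventory (values copied from the log, the file itself is hypothetical):

    # Illustration only: the same connection variables expressed as host vars.
    all:
      hosts:
        managed-node2:
          ansible_connection: ssh
          ansible_shell_type: sh
          ansible_shell_executable: /bin/sh
          ansible_timeout: 10
          ansible_pipelining: false
          ansible_module_compression: ZIP_DEFLATED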
16142 1727204126.91999: running TaskExecutor() for managed-node2/TASK: ** TEST check polling interval 16142 1727204126.92087: in run() - task 0affcd87-79f5-fddd-f6c7-000000000071 16142 1727204126.92100: variable 'ansible_search_path' from source: unknown 16142 1727204126.92142: calling self._execute() 16142 1727204126.92242: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204126.92248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204126.92266: variable 'omit' from source: magic vars 16142 1727204126.92661: variable 'ansible_distribution_major_version' from source: facts 16142 1727204126.92676: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204126.92682: variable 'omit' from source: magic vars 16142 1727204126.92706: variable 'omit' from source: magic vars 16142 1727204126.92809: variable 'controller_device' from source: play vars 16142 1727204126.92828: variable 'omit' from source: magic vars 16142 1727204126.92875: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204126.92909: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204126.92938: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204126.92955: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204126.92971: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204126.93003: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204126.93006: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204126.93009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204126.93119: Set connection var ansible_timeout to 10 16142 1727204126.93122: Set connection var ansible_connection to ssh 16142 1727204126.93126: Set connection var ansible_shell_type to sh 16142 1727204126.93138: Set connection var ansible_shell_executable to /bin/sh 16142 1727204126.93145: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204126.93152: Set connection var ansible_pipelining to False 16142 1727204126.93177: variable 'ansible_shell_executable' from source: unknown 16142 1727204126.93181: variable 'ansible_connection' from source: unknown 16142 1727204126.93183: variable 'ansible_module_compression' from source: unknown 16142 1727204126.93186: variable 'ansible_shell_type' from source: unknown 16142 1727204126.93188: variable 'ansible_shell_executable' from source: unknown 16142 1727204126.93190: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204126.93192: variable 'ansible_pipelining' from source: unknown 16142 1727204126.93197: variable 'ansible_timeout' from source: unknown 16142 1727204126.93201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204126.93337: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204126.93354: variable 'omit' from source: magic vars 
16142 1727204126.93360: starting attempt loop 16142 1727204126.93363: running the handler 16142 1727204126.93380: _low_level_execute_command(): starting 16142 1727204126.93387: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204126.94126: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204126.94146: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204126.94156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204126.94167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204126.94207: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204126.94216: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204126.94237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204126.94241: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204126.94261: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204126.94266: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204126.94269: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204126.94276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204126.94301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204126.94304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204126.94306: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204126.94309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204126.94484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204126.94502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204126.94514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204126.94590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204126.96266: stdout chunk (state=3): >>>/root <<< 16142 1727204126.96382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204126.96466: stderr chunk (state=3): >>><<< 16142 1727204126.96469: stdout chunk (state=3): >>><<< 16142 1727204126.96500: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204126.96511: _low_level_execute_command(): starting 16142 1727204126.96519: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204126.9649782-18134-182137746525163 `" && echo ansible-tmp-1727204126.9649782-18134-182137746525163="` echo /root/.ansible/tmp/ansible-tmp-1727204126.9649782-18134-182137746525163 `" ) && sleep 0' 16142 1727204126.98305: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204126.98435: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204126.98450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204126.98466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204126.98504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204126.98510: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204126.98520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204126.98541: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204126.98549: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204126.98556: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204126.98565: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204126.98577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204126.98589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204126.98596: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204126.98604: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204126.98611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204126.98774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204126.98793: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204126.98806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204126.99171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204127.00774: stdout chunk (state=3): >>>ansible-tmp-1727204126.9649782-18134-182137746525163=/root/.ansible/tmp/ansible-tmp-1727204126.9649782-18134-182137746525163 <<< 16142 1727204127.00966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204127.00975: stdout chunk (state=3): >>><<< 16142 1727204127.00985: stderr chunk (state=3): >>><<< 16142 1727204127.01005: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204126.9649782-18134-182137746525163=/root/.ansible/tmp/ansible-tmp-1727204126.9649782-18134-182137746525163 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204127.01040: variable 'ansible_module_compression' from source: unknown 16142 1727204127.01100: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 16142 1727204127.01143: variable 'ansible_facts' from source: unknown 16142 1727204127.01231: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204126.9649782-18134-182137746525163/AnsiballZ_command.py 16142 1727204127.01829: Sending initial data 16142 1727204127.01832: Sent initial data (156 bytes) 16142 1727204127.04630: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204127.04939: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204127.04957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204127.04974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204127.05014: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204127.05020: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204127.05032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.05051: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204127.05058: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204127.05067: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204127.05080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204127.05089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204127.05100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204127.05108: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204127.05114: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204127.05123: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.05205: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204127.05226: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204127.05241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204127.05308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204127.07035: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204127.07069: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204127.07114: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmp1cku7kdt /root/.ansible/tmp/ansible-tmp-1727204126.9649782-18134-182137746525163/AnsiballZ_command.py <<< 16142 1727204127.07143: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204127.08376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204127.08634: stderr chunk (state=3): >>><<< 16142 1727204127.08637: stdout chunk (state=3): >>><<< 16142 1727204127.08639: done transferring module to remote 16142 1727204127.08645: _low_level_execute_command(): starting 16142 1727204127.08647: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204126.9649782-18134-182137746525163/ /root/.ansible/tmp/ansible-tmp-1727204126.9649782-18134-182137746525163/AnsiballZ_command.py && sleep 0' 16142 1727204127.09702: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204127.09707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204127.09867: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204127.09871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204127.09874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204127.09876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.09926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 
1727204127.10057: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204127.10084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204127.10140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204127.11869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204127.11944: stderr chunk (state=3): >>><<< 16142 1727204127.11948: stdout chunk (state=3): >>><<< 16142 1727204127.12057: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204127.12061: _low_level_execute_command(): starting 16142 1727204127.12065: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204126.9649782-18134-182137746525163/AnsiballZ_command.py && sleep 0' 16142 1727204127.13485: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204127.13489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204127.13521: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204127.13524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204127.13526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204127.13528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.13710: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204127.13811: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204127.13863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 16142 1727204127.27553: stdout chunk (state=3): >>> {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-24 14:55:27.271134", "end": "2024-09-24 14:55:27.274738", "delta": "0:00:00.003604", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 16142 1727204127.28894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 16142 1727204127.28898: stdout chunk (state=3): >>><<< 16142 1727204127.28903: stderr chunk (state=3): >>><<< 16142 1727204127.28921: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-24 14:55:27.271134", "end": "2024-09-24 14:55:27.274738", "delta": "0:00:00.003604", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
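The grep invocation above feeds the "** TEST check polling interval" task whose result is reported a few records below. A minimal sketch of what such a task could look like, reconstructed only from the logged module_args and the conditional evaluated on the result ('110' in result.stdout); the register name is taken from the log, while the until/retry form and any templating are assumptions, and the exact wording in tests_bond_removal.yml may differ:

    - name: "** TEST check polling interval"
      # the literal nm-bond path is copied from the logged module_args;
      # in the playbook it is presumably templated from the controller_device play variable
      command: grep 'Polling Interval' /proc/net/bonding/nm-bond
      register: result
      until: "'110' in result.stdout"   # retries/delay values are not visible in this log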
16142 1727204127.28968: done with _execute_module (ansible.legacy.command, {'_raw_params': "grep 'Polling Interval' /proc/net/bonding/nm-bond", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204126.9649782-18134-182137746525163/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204127.28977: _low_level_execute_command(): starting 16142 1727204127.28979: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204126.9649782-18134-182137746525163/ > /dev/null 2>&1 && sleep 0' 16142 1727204127.29946: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204127.29950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204127.30178: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.30182: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.30247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204127.30262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204127.30272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204127.30342: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204127.32261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204127.32271: stderr chunk (state=3): >>><<< 16142 1727204127.32274: stdout chunk (state=3): >>><<< 16142 1727204127.32294: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204127.32301: handler run complete 16142 1727204127.32330: Evaluated conditional (False): False 16142 1727204127.32496: variable 'result' from source: unknown 16142 1727204127.32512: Evaluated conditional ('110' in result.stdout): True 16142 1727204127.32524: attempt loop complete, returning result 16142 1727204127.32529: _execute() done 16142 1727204127.32532: dumping result to json 16142 1727204127.32538: done dumping result, returning 16142 1727204127.32547: done running TaskExecutor() for managed-node2/TASK: ** TEST check polling interval [0affcd87-79f5-fddd-f6c7-000000000071] 16142 1727204127.32553: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000071 16142 1727204127.32656: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000071 16142 1727204127.32659: WORKER PROCESS EXITING ok: [managed-node2] => { "attempts": 1, "changed": false, "cmd": [ "grep", "Polling Interval", "/proc/net/bonding/nm-bond" ], "delta": "0:00:00.003604", "end": "2024-09-24 14:55:27.274738", "rc": 0, "start": "2024-09-24 14:55:27.271134" } STDOUT: MII Polling Interval (ms): 110 16142 1727204127.32766: no more pending results, returning what we have 16142 1727204127.32770: results queue empty 16142 1727204127.32771: checking for any_errors_fatal 16142 1727204127.32776: done checking for any_errors_fatal 16142 1727204127.32777: checking for max_fail_percentage 16142 1727204127.32779: done checking for max_fail_percentage 16142 1727204127.32779: checking to see if all hosts have failed and the running result is not ok 16142 1727204127.32780: done checking to see if all hosts have failed 16142 1727204127.32781: getting the remaining hosts for this loop 16142 1727204127.32782: done getting the remaining hosts for this loop 16142 1727204127.32786: getting the next task for host managed-node2 16142 1727204127.32791: done getting next task for host managed-node2 16142 1727204127.32793: ^ task is: TASK: ** TEST check IPv4 16142 1727204127.32795: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204127.32798: getting variables 16142 1727204127.32800: in VariableManager get_vars() 16142 1727204127.32848: Calling all_inventory to load vars for managed-node2 16142 1727204127.32851: Calling groups_inventory to load vars for managed-node2 16142 1727204127.32852: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204127.32862: Calling all_plugins_play to load vars for managed-node2 16142 1727204127.32866: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204127.32868: Calling groups_plugins_play to load vars for managed-node2 16142 1727204127.34214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204127.35140: done with get_vars() 16142 1727204127.35159: done getting variables 16142 1727204127.35205: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:80 Tuesday 24 September 2024 14:55:27 -0400 (0:00:00.442) 0:00:26.529 ***** 16142 1727204127.35227: entering _queue_task() for managed-node2/command 16142 1727204127.35471: worker is 1 (out of 1 available) 16142 1727204127.35484: exiting _queue_task() for managed-node2/command 16142 1727204127.35496: done queuing things up, now waiting for results queue to drain 16142 1727204127.35497: waiting for pending results... 16142 1727204127.35681: running TaskExecutor() for managed-node2/TASK: ** TEST check IPv4 16142 1727204127.35744: in run() - task 0affcd87-79f5-fddd-f6c7-000000000072 16142 1727204127.35756: variable 'ansible_search_path' from source: unknown 16142 1727204127.35789: calling self._execute() 16142 1727204127.35868: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204127.35874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204127.35883: variable 'omit' from source: magic vars 16142 1727204127.36314: variable 'ansible_distribution_major_version' from source: facts 16142 1727204127.36317: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204127.36320: variable 'omit' from source: magic vars 16142 1727204127.36322: variable 'omit' from source: magic vars 16142 1727204127.36458: variable 'controller_device' from source: play vars 16142 1727204127.36461: variable 'omit' from source: magic vars 16142 1727204127.36497: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204127.36505: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204127.36518: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204127.36542: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204127.36556: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204127.36589: variable 'inventory_hostname' from source: host vars for 'managed-node2' 
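Every SSH exchange in this section targets 10.31.13.78, the address resolved from the managed-node2 host vars referenced above (ansible_host, ansible_ssh_extra_args). A hypothetical inventory fragment consistent with that, shown only for orientation; the real inventory file and the actual ssh extra arguments are not visible in this log:

    all:
      hosts:
        managed-node2:
          ansible_host: 10.31.13.78
          # ansible_ssh_extra_args: not shown in this log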
16142 1727204127.36592: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204127.36595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204127.36725: Set connection var ansible_timeout to 10 16142 1727204127.36728: Set connection var ansible_connection to ssh 16142 1727204127.36737: Set connection var ansible_shell_type to sh 16142 1727204127.36745: Set connection var ansible_shell_executable to /bin/sh 16142 1727204127.36751: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204127.36759: Set connection var ansible_pipelining to False 16142 1727204127.36785: variable 'ansible_shell_executable' from source: unknown 16142 1727204127.36792: variable 'ansible_connection' from source: unknown 16142 1727204127.36795: variable 'ansible_module_compression' from source: unknown 16142 1727204127.36798: variable 'ansible_shell_type' from source: unknown 16142 1727204127.36800: variable 'ansible_shell_executable' from source: unknown 16142 1727204127.36802: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204127.36806: variable 'ansible_pipelining' from source: unknown 16142 1727204127.36809: variable 'ansible_timeout' from source: unknown 16142 1727204127.36814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204127.36975: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204127.36985: variable 'omit' from source: magic vars 16142 1727204127.36991: starting attempt loop 16142 1727204127.36994: running the handler 16142 1727204127.37014: _low_level_execute_command(): starting 16142 1727204127.37021: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204127.37797: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204127.37801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204127.37818: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.37843: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204127.37846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.37901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204127.37908: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204127.37912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204127.37952: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 16142 1727204127.39620: stdout chunk (state=3): >>>/root <<< 16142 1727204127.39723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204127.39801: stderr chunk (state=3): >>><<< 16142 1727204127.39803: stdout chunk (state=3): >>><<< 16142 1727204127.39847: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204127.39852: _low_level_execute_command(): starting 16142 1727204127.39854: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204127.3981853-18170-16243498195517 `" && echo ansible-tmp-1727204127.3981853-18170-16243498195517="` echo /root/.ansible/tmp/ansible-tmp-1727204127.3981853-18170-16243498195517 `" ) && sleep 0' 16142 1727204127.40310: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204127.40354: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204127.40381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204127.40397: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204127.40399: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.40450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204127.40537: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204127.40540: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204127.40553: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 16142 1727204127.42466: stdout chunk (state=3): >>>ansible-tmp-1727204127.3981853-18170-16243498195517=/root/.ansible/tmp/ansible-tmp-1727204127.3981853-18170-16243498195517 <<< 16142 1727204127.42592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204127.42646: stderr chunk (state=3): >>><<< 16142 1727204127.42649: stdout chunk (state=3): >>><<< 16142 1727204127.42670: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204127.3981853-18170-16243498195517=/root/.ansible/tmp/ansible-tmp-1727204127.3981853-18170-16243498195517 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204127.42697: variable 'ansible_module_compression' from source: unknown 16142 1727204127.42741: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 16142 1727204127.42775: variable 'ansible_facts' from source: unknown 16142 1727204127.42837: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204127.3981853-18170-16243498195517/AnsiballZ_command.py 16142 1727204127.42947: Sending initial data 16142 1727204127.42950: Sent initial data (155 bytes) 16142 1727204127.43636: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204127.43641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204127.43679: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204127.43686: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.43692: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204127.43698: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 16142 1727204127.43704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204127.43714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204127.43719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204127.43728: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204127.43736: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.43804: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204127.43808: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204127.43813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204127.43867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204127.45653: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204127.45692: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204127.45734: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpkskve6c1 /root/.ansible/tmp/ansible-tmp-1727204127.3981853-18170-16243498195517/AnsiballZ_command.py <<< 16142 1727204127.45771: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204127.46899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204127.47088: stderr chunk (state=3): >>><<< 16142 1727204127.47092: stdout chunk (state=3): >>><<< 16142 1727204127.47094: done transferring module to remote 16142 1727204127.47097: _low_level_execute_command(): starting 16142 1727204127.47103: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204127.3981853-18170-16243498195517/ /root/.ansible/tmp/ansible-tmp-1727204127.3981853-18170-16243498195517/AnsiballZ_command.py && sleep 0' 16142 1727204127.47776: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204127.47786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204127.47797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204127.47819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204127.47859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204127.47868: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204127.47878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.47892: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204127.47899: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204127.47905: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204127.47914: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 16142 1727204127.47926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204127.47942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204127.47948: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204127.47954: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204127.47966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.48044: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204127.48060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204127.48074: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204127.48141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204127.49962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204127.50059: stderr chunk (state=3): >>><<< 16142 1727204127.50066: stdout chunk (state=3): >>><<< 16142 1727204127.50095: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204127.50098: _low_level_execute_command(): starting 16142 1727204127.50103: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204127.3981853-18170-16243498195517/AnsiballZ_command.py && sleep 0' 16142 1727204127.50799: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204127.50808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204127.50819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204127.50834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204127.50883: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204127.50890: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204127.50899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.50913: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 16142 1727204127.50920: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204127.50926: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204127.50934: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204127.50948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204127.50968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204127.50975: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204127.50982: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204127.50991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.51075: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204127.51105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204127.51191: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204127.51352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204127.65072: stdout chunk (state=3): >>> {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.113/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 234sec preferred_lft 234sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-24 14:55:27.646096", "end": "2024-09-24 14:55:27.649840", "delta": "0:00:00.003744", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 16142 1727204127.66387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204127.66413: stderr chunk (state=3): >>><<< 16142 1727204127.66416: stdout chunk (state=3): >>><<< 16142 1727204127.66443: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.113/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 234sec preferred_lft 234sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-24 14:55:27.646096", "end": "2024-09-24 14:55:27.649840", "delta": "0:00:00.003744", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
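The invocation just completed is the "** TEST check IPv4" task from tests_bond_removal.yml:80. A minimal sketch consistent with the logged module_args and with the conditional evaluated on the result a few records below ('192.0.2' in result.stdout); the use of the controller_device play variable (which the log shows resolving for this task) and the until form are assumptions, not the verbatim playbook text:

    - name: "** TEST check IPv4"
      command: "ip -4 a s {{ controller_device }}"   # resolves to nm-bond in this run
      register: result
      until: "'192.0.2' in result.stdout"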
16142 1727204127.66487: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204127.3981853-18170-16243498195517/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204127.66496: _low_level_execute_command(): starting 16142 1727204127.66501: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204127.3981853-18170-16243498195517/ > /dev/null 2>&1 && sleep 0' 16142 1727204127.67222: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204127.67243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204127.67254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204127.67271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204127.67310: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204127.67319: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204127.67328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.67353: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204127.67361: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204127.67372: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204127.67380: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204127.67390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204127.67401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204127.67408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204127.67415: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204127.67425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.67510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204127.67528: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204127.67544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204127.67627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204127.69481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204127.69580: stderr chunk (state=3): >>><<< 16142 1727204127.69584: stdout chunk (state=3): >>><<< 16142 1727204127.69610: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 
4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204127.69616: handler run complete 16142 1727204127.69647: Evaluated conditional (False): False 16142 1727204127.69800: variable 'result' from source: set_fact 16142 1727204127.69816: Evaluated conditional ('192.0.2' in result.stdout): True 16142 1727204127.69827: attempt loop complete, returning result 16142 1727204127.69830: _execute() done 16142 1727204127.69833: dumping result to json 16142 1727204127.69841: done dumping result, returning 16142 1727204127.69850: done running TaskExecutor() for managed-node2/TASK: ** TEST check IPv4 [0affcd87-79f5-fddd-f6c7-000000000072] 16142 1727204127.69856: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000072 16142 1727204127.69970: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000072 16142 1727204127.69973: WORKER PROCESS EXITING ok: [managed-node2] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003744", "end": "2024-09-24 14:55:27.649840", "rc": 0, "start": "2024-09-24 14:55:27.646096" } STDOUT: 19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.113/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 234sec preferred_lft 234sec 16142 1727204127.70058: no more pending results, returning what we have 16142 1727204127.70062: results queue empty 16142 1727204127.70063: checking for any_errors_fatal 16142 1727204127.70077: done checking for any_errors_fatal 16142 1727204127.70077: checking for max_fail_percentage 16142 1727204127.70079: done checking for max_fail_percentage 16142 1727204127.70080: checking to see if all hosts have failed and the running result is not ok 16142 1727204127.70081: done checking to see if all hosts have failed 16142 1727204127.70081: getting the remaining hosts for this loop 16142 1727204127.70084: done getting the remaining hosts for this loop 16142 1727204127.70088: getting the next task for host managed-node2 16142 1727204127.70094: done getting next task for host managed-node2 16142 1727204127.70097: ^ task is: TASK: ** TEST check IPv6 16142 1727204127.70099: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204127.70103: getting variables 16142 1727204127.70105: in VariableManager get_vars() 16142 1727204127.70172: Calling all_inventory to load vars for managed-node2 16142 1727204127.70177: Calling groups_inventory to load vars for managed-node2 16142 1727204127.70179: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204127.70191: Calling all_plugins_play to load vars for managed-node2 16142 1727204127.70193: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204127.70196: Calling groups_plugins_play to load vars for managed-node2 16142 1727204127.72108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204127.73922: done with get_vars() 16142 1727204127.73952: done getting variables 16142 1727204127.74023: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:87 Tuesday 24 September 2024 14:55:27 -0400 (0:00:00.388) 0:00:26.917 ***** 16142 1727204127.74056: entering _queue_task() for managed-node2/command 16142 1727204127.74416: worker is 1 (out of 1 available) 16142 1727204127.74438: exiting _queue_task() for managed-node2/command 16142 1727204127.74451: done queuing things up, now waiting for results queue to drain 16142 1727204127.74452: waiting for pending results... 16142 1727204127.74756: running TaskExecutor() for managed-node2/TASK: ** TEST check IPv6 16142 1727204127.74837: in run() - task 0affcd87-79f5-fddd-f6c7-000000000073 16142 1727204127.74854: variable 'ansible_search_path' from source: unknown 16142 1727204127.74902: calling self._execute() 16142 1727204127.75012: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204127.75019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204127.75029: variable 'omit' from source: magic vars 16142 1727204127.75456: variable 'ansible_distribution_major_version' from source: facts 16142 1727204127.75471: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204127.75478: variable 'omit' from source: magic vars 16142 1727204127.75497: variable 'omit' from source: magic vars 16142 1727204127.75607: variable 'controller_device' from source: play vars 16142 1727204127.75632: variable 'omit' from source: magic vars 16142 1727204127.75683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204127.75717: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204127.75748: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204127.75769: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204127.75785: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204127.75814: variable 'inventory_hostname' from source: host vars for 'managed-node2' 
16142 1727204127.75817: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204127.75820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204127.75936: Set connection var ansible_timeout to 10 16142 1727204127.75940: Set connection var ansible_connection to ssh 16142 1727204127.75947: Set connection var ansible_shell_type to sh 16142 1727204127.75960: Set connection var ansible_shell_executable to /bin/sh 16142 1727204127.75967: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204127.75976: Set connection var ansible_pipelining to False 16142 1727204127.76005: variable 'ansible_shell_executable' from source: unknown 16142 1727204127.76009: variable 'ansible_connection' from source: unknown 16142 1727204127.76012: variable 'ansible_module_compression' from source: unknown 16142 1727204127.76015: variable 'ansible_shell_type' from source: unknown 16142 1727204127.76017: variable 'ansible_shell_executable' from source: unknown 16142 1727204127.76020: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204127.76022: variable 'ansible_pipelining' from source: unknown 16142 1727204127.76024: variable 'ansible_timeout' from source: unknown 16142 1727204127.76028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204127.76188: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204127.76201: variable 'omit' from source: magic vars 16142 1727204127.76207: starting attempt loop 16142 1727204127.76210: running the handler 16142 1727204127.76226: _low_level_execute_command(): starting 16142 1727204127.76233: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204127.77071: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204127.77086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204127.77096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204127.77111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204127.77153: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204127.77166: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204127.77177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.77193: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204127.77201: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204127.77208: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204127.77216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204127.77226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204127.77240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204127.77249: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.13.78 originally 10.31.13.78 <<< 16142 1727204127.77257: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204127.77279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.77356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204127.77381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204127.77393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204127.77471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204127.79147: stdout chunk (state=3): >>>/root <<< 16142 1727204127.79250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204127.79343: stderr chunk (state=3): >>><<< 16142 1727204127.79361: stdout chunk (state=3): >>><<< 16142 1727204127.79495: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204127.79499: _low_level_execute_command(): starting 16142 1727204127.79502: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204127.7939858-18187-85426837784991 `" && echo ansible-tmp-1727204127.7939858-18187-85426837784991="` echo /root/.ansible/tmp/ansible-tmp-1727204127.7939858-18187-85426837784991 `" ) && sleep 0' 16142 1727204127.80216: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204127.80231: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204127.80248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204127.80270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204127.80321: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204127.80334: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204127.80350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.80372: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204127.80387: stderr chunk (state=3): >>>debug2: resolve_canonicalize: 
hostname 10.31.13.78 is address <<< 16142 1727204127.80404: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204127.80418: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204127.80434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204127.80460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204127.80492: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204127.80516: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204127.80618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.80698: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204127.80727: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204127.80746: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204127.80823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204127.82735: stdout chunk (state=3): >>>ansible-tmp-1727204127.7939858-18187-85426837784991=/root/.ansible/tmp/ansible-tmp-1727204127.7939858-18187-85426837784991 <<< 16142 1727204127.82847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204127.82945: stderr chunk (state=3): >>><<< 16142 1727204127.82957: stdout chunk (state=3): >>><<< 16142 1727204127.83269: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204127.7939858-18187-85426837784991=/root/.ansible/tmp/ansible-tmp-1727204127.7939858-18187-85426837784991 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204127.83273: variable 'ansible_module_compression' from source: unknown 16142 1727204127.83276: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 16142 1727204127.83278: variable 'ansible_facts' from source: unknown 16142 1727204127.83280: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204127.7939858-18187-85426837784991/AnsiballZ_command.py 16142 1727204127.83369: Sending initial data 16142 1727204127.83381: Sent initial data (155 bytes) 16142 1727204127.84436: stderr chunk (state=3): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204127.84451: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204127.84468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204127.84497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204127.84540: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204127.84553: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204127.84568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.84593: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204127.84613: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204127.84626: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204127.84640: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204127.84654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204127.84672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204127.84684: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204127.84697: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204127.84719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.84797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204127.84829: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204127.84847: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204127.84919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204127.86718: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204127.86759: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204127.86790: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmp1m86jlpr /root/.ansible/tmp/ansible-tmp-1727204127.7939858-18187-85426837784991/AnsiballZ_command.py <<< 16142 1727204127.86804: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204127.88058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204127.88212: stderr chunk (state=3): >>><<< 16142 1727204127.88215: stdout chunk (state=3): >>><<< 16142 1727204127.88218: done transferring module to remote 16142 
1727204127.88224: _low_level_execute_command(): starting 16142 1727204127.88227: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204127.7939858-18187-85426837784991/ /root/.ansible/tmp/ansible-tmp-1727204127.7939858-18187-85426837784991/AnsiballZ_command.py && sleep 0' 16142 1727204127.88925: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204127.88941: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204127.88958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204127.88986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204127.89030: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204127.89050: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204127.89069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.89095: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204127.89109: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204127.89120: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204127.89132: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204127.89146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204127.89162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204127.89177: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204127.89193: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204127.89213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.89287: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204127.89318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204127.89336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204127.89407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204127.91204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204127.91320: stderr chunk (state=3): >>><<< 16142 1727204127.91331: stdout chunk (state=3): >>><<< 16142 1727204127.91459: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204127.91463: _low_level_execute_command(): starting 16142 1727204127.91467: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204127.7939858-18187-85426837784991/AnsiballZ_command.py && sleep 0' 16142 1727204127.92095: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204127.92121: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204127.92139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204127.92183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204127.92591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.92594: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204127.92597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204127.92654: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204127.92677: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204127.92684: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204127.92759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204128.06567: stdout chunk (state=3): >>> {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::1df/128 scope global dynamic noprefixroute \n valid_lft 235sec preferred_lft 235sec\n inet6 2001:db8::ee44:17b3:b183:75ee/64 scope global dynamic noprefixroute \n valid_lft 1796sec preferred_lft 1796sec\n inet6 fe80::57f3:e8d4:6a2e:4f55/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-24 14:55:28.061297", "end": "2024-09-24 14:55:28.064923", "delta": "0:00:00.003626", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 16142 1727204128.07897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204128.07901: stderr chunk (state=3): >>><<< 16142 1727204128.07907: stdout chunk (state=3): >>><<< 16142 1727204128.07931: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::1df/128 scope global dynamic noprefixroute \n valid_lft 235sec preferred_lft 235sec\n inet6 2001:db8::ee44:17b3:b183:75ee/64 scope global dynamic noprefixroute \n valid_lft 1796sec preferred_lft 1796sec\n inet6 fe80::57f3:e8d4:6a2e:4f55/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-24 14:55:28.061297", "end": "2024-09-24 14:55:28.064923", "delta": "0:00:00.003626", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
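The exchange above is the full cycle the log records for one command task over the multiplexed SSH connection: create a remote temp directory, transfer the AnsiballZ_command.py wrapper over sftp, mark it executable, run it with the remote /usr/bin/python3.9, read the module's JSON result from stdout, and remove the temp directory. The "attempts" field in that result and the "'2001' in result.stdout" condition evaluated just below suggest the task was registered and guarded with an until retry. A hedged sketch of a task that would drive this invocation follows; the real playbook is not part of this log, so the register name, retries and delay are assumptions.

```yaml
# Sketch only: command and condition are taken from the log output below;
# retries/delay are assumed (the log records just "attempts": 1).
- name: "** TEST check IPv6"
  ansible.builtin.command: ip -6 a s nm-bond
  register: result
  until: "'2001' in result.stdout"
  retries: 5
  delay: 2
```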
16142 1727204128.07985: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204127.7939858-18187-85426837784991/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204128.07993: _low_level_execute_command(): starting 16142 1727204128.07998: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204127.7939858-18187-85426837784991/ > /dev/null 2>&1 && sleep 0' 16142 1727204128.08798: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204128.08804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204128.08850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204128.08859: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204128.08911: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204128.08921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204128.08979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204128.10959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204128.10963: stderr chunk (state=3): >>><<< 16142 1727204128.10968: stdout chunk (state=3): >>><<< 16142 1727204128.10970: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204128.10973: handler run complete 16142 1727204128.10976: Evaluated conditional (False): False 16142 1727204128.11106: variable 'result' from source: set_fact 16142 1727204128.11124: Evaluated conditional ('2001' in result.stdout): True 16142 1727204128.11138: attempt loop complete, returning result 16142 1727204128.11141: _execute() done 16142 1727204128.11144: dumping result to json 16142 1727204128.11148: done dumping result, returning 16142 1727204128.11158: done running TaskExecutor() for managed-node2/TASK: ** TEST check IPv6 [0affcd87-79f5-fddd-f6c7-000000000073] 16142 1727204128.11163: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000073 16142 1727204128.11285: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000073 16142 1727204128.11288: WORKER PROCESS EXITING ok: [managed-node2] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003626", "end": "2024-09-24 14:55:28.064923", "rc": 0, "start": "2024-09-24 14:55:28.061297" } STDOUT: 19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::1df/128 scope global dynamic noprefixroute valid_lft 235sec preferred_lft 235sec inet6 2001:db8::ee44:17b3:b183:75ee/64 scope global dynamic noprefixroute valid_lft 1796sec preferred_lft 1796sec inet6 fe80::57f3:e8d4:6a2e:4f55/64 scope link noprefixroute valid_lft forever preferred_lft forever 16142 1727204128.11375: no more pending results, returning what we have 16142 1727204128.11380: results queue empty 16142 1727204128.11381: checking for any_errors_fatal 16142 1727204128.11388: done checking for any_errors_fatal 16142 1727204128.11389: checking for max_fail_percentage 16142 1727204128.11391: done checking for max_fail_percentage 16142 1727204128.11392: checking to see if all hosts have failed and the running result is not ok 16142 1727204128.11393: done checking to see if all hosts have failed 16142 1727204128.11393: getting the remaining hosts for this loop 16142 1727204128.11395: done getting the remaining hosts for this loop 16142 1727204128.11399: getting the next task for host managed-node2 16142 1727204128.11407: done getting next task for host managed-node2 16142 1727204128.11413: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 16142 1727204128.11417: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204128.11436: getting variables 16142 1727204128.11439: in VariableManager get_vars() 16142 1727204128.11502: Calling all_inventory to load vars for managed-node2 16142 1727204128.11505: Calling groups_inventory to load vars for managed-node2 16142 1727204128.11508: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204128.11519: Calling all_plugins_play to load vars for managed-node2 16142 1727204128.11521: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204128.11525: Calling groups_plugins_play to load vars for managed-node2 16142 1727204128.13216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204128.14144: done with get_vars() 16142 1727204128.14162: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:55:28 -0400 (0:00:00.401) 0:00:27.319 ***** 16142 1727204128.14241: entering _queue_task() for managed-node2/include_tasks 16142 1727204128.14481: worker is 1 (out of 1 available) 16142 1727204128.14496: exiting _queue_task() for managed-node2/include_tasks 16142 1727204128.14508: done queuing things up, now waiting for results queue to drain 16142 1727204128.14509: waiting for pending results... 16142 1727204128.15044: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 16142 1727204128.15050: in run() - task 0affcd87-79f5-fddd-f6c7-00000000007b 16142 1727204128.15053: variable 'ansible_search_path' from source: unknown 16142 1727204128.15057: variable 'ansible_search_path' from source: unknown 16142 1727204128.15060: calling self._execute() 16142 1727204128.15070: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204128.15073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204128.15349: variable 'omit' from source: magic vars 16142 1727204128.15698: variable 'ansible_distribution_major_version' from source: facts 16142 1727204128.15701: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204128.15703: _execute() done 16142 1727204128.15705: dumping result to json 16142 1727204128.15707: done dumping result, returning 16142 1727204128.15709: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-fddd-f6c7-00000000007b] 16142 1727204128.15710: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000007b 16142 1727204128.15827: no more pending results, returning what we have 16142 1727204128.15832: in VariableManager get_vars() 16142 1727204128.15900: Calling all_inventory to load vars for managed-node2 16142 1727204128.15903: Calling groups_inventory to load vars for managed-node2 16142 1727204128.15905: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204128.15915: Calling all_plugins_play to load vars for managed-node2 16142 1727204128.15917: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204128.15925: Calling groups_plugins_play to load vars for managed-node2 16142 1727204128.16459: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000007b 16142 1727204128.16465: WORKER PROCESS EXITING 16142 1727204128.17388: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204128.18321: done with get_vars() 16142 1727204128.18339: variable 'ansible_search_path' from source: unknown 16142 1727204128.18340: variable 'ansible_search_path' from source: unknown 16142 1727204128.18371: we have included files to process 16142 1727204128.18372: generating all_blocks data 16142 1727204128.18374: done generating all_blocks data 16142 1727204128.18378: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 16142 1727204128.18379: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 16142 1727204128.18380: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 16142 1727204128.18777: done processing included file 16142 1727204128.18779: iterating over new_blocks loaded from include file 16142 1727204128.18780: in VariableManager get_vars() 16142 1727204128.18803: done with get_vars() 16142 1727204128.18804: filtering new block on tags 16142 1727204128.18817: done filtering new block on tags 16142 1727204128.18819: in VariableManager get_vars() 16142 1727204128.18837: done with get_vars() 16142 1727204128.18838: filtering new block on tags 16142 1727204128.18862: done filtering new block on tags 16142 1727204128.18866: in VariableManager get_vars() 16142 1727204128.18917: done with get_vars() 16142 1727204128.18919: filtering new block on tags 16142 1727204128.18936: done filtering new block on tags 16142 1727204128.18938: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 16142 1727204128.18943: extending task lists for all hosts with included blocks 16142 1727204128.20058: done extending task lists 16142 1727204128.20059: done processing included files 16142 1727204128.20060: results queue empty 16142 1727204128.20060: checking for any_errors_fatal 16142 1727204128.20063: done checking for any_errors_fatal 16142 1727204128.20065: checking for max_fail_percentage 16142 1727204128.20066: done checking for max_fail_percentage 16142 1727204128.20067: checking to see if all hosts have failed and the running result is not ok 16142 1727204128.20067: done checking to see if all hosts have failed 16142 1727204128.20068: getting the remaining hosts for this loop 16142 1727204128.20069: done getting the remaining hosts for this loop 16142 1727204128.20071: getting the next task for host managed-node2 16142 1727204128.20080: done getting next task for host managed-node2 16142 1727204128.20083: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 16142 1727204128.20086: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204128.20111: getting variables 16142 1727204128.20113: in VariableManager get_vars() 16142 1727204128.20181: Calling all_inventory to load vars for managed-node2 16142 1727204128.20186: Calling groups_inventory to load vars for managed-node2 16142 1727204128.20190: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204128.20196: Calling all_plugins_play to load vars for managed-node2 16142 1727204128.20199: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204128.20202: Calling groups_plugins_play to load vars for managed-node2 16142 1727204128.21060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204128.21993: done with get_vars() 16142 1727204128.22009: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:55:28 -0400 (0:00:00.078) 0:00:27.397 ***** 16142 1727204128.22073: entering _queue_task() for managed-node2/setup 16142 1727204128.22322: worker is 1 (out of 1 available) 16142 1727204128.22339: exiting _queue_task() for managed-node2/setup 16142 1727204128.22351: done queuing things up, now waiting for results queue to drain 16142 1727204128.22352: waiting for pending results... 16142 1727204128.22584: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 16142 1727204128.22738: in run() - task 0affcd87-79f5-fddd-f6c7-0000000006c5 16142 1727204128.22757: variable 'ansible_search_path' from source: unknown 16142 1727204128.22768: variable 'ansible_search_path' from source: unknown 16142 1727204128.22817: calling self._execute() 16142 1727204128.22922: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204128.22937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204128.22954: variable 'omit' from source: magic vars 16142 1727204128.23391: variable 'ansible_distribution_major_version' from source: facts 16142 1727204128.23412: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204128.23667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204128.25517: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204128.25580: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204128.25608: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204128.25643: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204128.25666: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204128.25725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
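The "Ensure ansible_facts used by role" step that completed just above is an include_tasks handled on the controller: no module is shipped to the host, the ansible_distribution_major_version != '6' guard is evaluated, and the blocks from set_facts.yml are spliced into managed-node2's task list ("extending task lists for all hosts with included blocks"). A hedged sketch of what the entry at roles/network/tasks/main.yml:4 could look like; the file itself is not shown in this log.

```yaml
# Sketch, not the role's actual source: the task name, included file and
# guard are taken from the log records above; everything else is assumed.
- name: Ensure ansible_facts used by role
  ansible.builtin.include_tasks: set_facts.yml
  when: ansible_distribution_major_version != '6'
```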
16142 1727204128.25754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204128.25779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204128.25803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204128.25813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204128.25856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204128.25875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204128.25892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204128.25917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204128.25927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204128.26041: variable '__network_required_facts' from source: role '' defaults 16142 1727204128.26051: variable 'ansible_facts' from source: unknown 16142 1727204128.26611: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 16142 1727204128.26615: when evaluation is False, skipping this task 16142 1727204128.26618: _execute() done 16142 1727204128.26622: dumping result to json 16142 1727204128.26624: done dumping result, returning 16142 1727204128.26629: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-fddd-f6c7-0000000006c5] 16142 1727204128.26638: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000006c5 16142 1727204128.26723: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000006c5 16142 1727204128.26726: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204128.26796: no more pending results, returning what we have 16142 1727204128.26801: results queue empty 16142 1727204128.26802: checking for any_errors_fatal 16142 1727204128.26804: done checking for any_errors_fatal 16142 1727204128.26805: checking for max_fail_percentage 16142 1727204128.26806: done checking for max_fail_percentage 16142 1727204128.26807: checking to see if all hosts have failed and the running 
result is not ok 16142 1727204128.26808: done checking to see if all hosts have failed 16142 1727204128.26808: getting the remaining hosts for this loop 16142 1727204128.26810: done getting the remaining hosts for this loop 16142 1727204128.26815: getting the next task for host managed-node2 16142 1727204128.26825: done getting next task for host managed-node2 16142 1727204128.26829: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 16142 1727204128.26837: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204128.26858: getting variables 16142 1727204128.26860: in VariableManager get_vars() 16142 1727204128.26919: Calling all_inventory to load vars for managed-node2 16142 1727204128.26922: Calling groups_inventory to load vars for managed-node2 16142 1727204128.26924: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204128.26938: Calling all_plugins_play to load vars for managed-node2 16142 1727204128.26941: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204128.26945: Calling groups_plugins_play to load vars for managed-node2 16142 1727204128.28589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204128.30370: done with get_vars() 16142 1727204128.30397: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:55:28 -0400 (0:00:00.084) 0:00:27.481 ***** 16142 1727204128.30510: entering _queue_task() for managed-node2/stat 16142 1727204128.30910: worker is 1 (out of 1 available) 16142 1727204128.30922: exiting _queue_task() for managed-node2/stat 16142 1727204128.30939: done queuing things up, now waiting for results queue to drain 16142 1727204128.30940: waiting for pending results... 
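The skip recorded above is driven by the guard __network_required_facts | difference(ansible_facts.keys() | list) | length > 0: the setup task gathers facts only when at least one fact the role needs is missing from ansible_facts, and since facts were already collected in this run the condition is False; the result is also censored because the task runs with no_log. A minimal sketch of that pattern, assuming a plain setup call since the role's real task body is not shown here:

```yaml
# Sketch of the conditional fact-gathering guard logged above. The when:
# expression and the no_log behaviour come from the log; gather_subset is an
# assumption, and __network_required_facts comes from the role's defaults.
- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    gather_subset: min
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true
```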
16142 1727204128.31289: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 16142 1727204128.31489: in run() - task 0affcd87-79f5-fddd-f6c7-0000000006c7 16142 1727204128.31513: variable 'ansible_search_path' from source: unknown 16142 1727204128.31522: variable 'ansible_search_path' from source: unknown 16142 1727204128.31577: calling self._execute() 16142 1727204128.31686: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204128.31697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204128.31718: variable 'omit' from source: magic vars 16142 1727204128.32177: variable 'ansible_distribution_major_version' from source: facts 16142 1727204128.32206: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204128.32399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204128.32722: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204128.32783: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204128.32825: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204128.32875: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204128.32978: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204128.33007: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204128.33044: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204128.33087: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204128.33197: variable '__network_is_ostree' from source: set_fact 16142 1727204128.33207: Evaluated conditional (not __network_is_ostree is defined): False 16142 1727204128.33213: when evaluation is False, skipping this task 16142 1727204128.33219: _execute() done 16142 1727204128.33224: dumping result to json 16142 1727204128.33230: done dumping result, returning 16142 1727204128.33245: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-fddd-f6c7-0000000006c7] 16142 1727204128.33254: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000006c7 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 16142 1727204128.33470: no more pending results, returning what we have 16142 1727204128.33474: results queue empty 16142 1727204128.33475: checking for any_errors_fatal 16142 1727204128.33483: done checking for any_errors_fatal 16142 1727204128.33483: checking for max_fail_percentage 16142 1727204128.33485: done checking for max_fail_percentage 16142 1727204128.33486: checking to see if all hosts have 
failed and the running result is not ok 16142 1727204128.33487: done checking to see if all hosts have failed 16142 1727204128.33487: getting the remaining hosts for this loop 16142 1727204128.33489: done getting the remaining hosts for this loop 16142 1727204128.33493: getting the next task for host managed-node2 16142 1727204128.33499: done getting next task for host managed-node2 16142 1727204128.33504: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 16142 1727204128.33508: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204128.33527: getting variables 16142 1727204128.33529: in VariableManager get_vars() 16142 1727204128.33595: Calling all_inventory to load vars for managed-node2 16142 1727204128.33598: Calling groups_inventory to load vars for managed-node2 16142 1727204128.33601: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204128.33612: Calling all_plugins_play to load vars for managed-node2 16142 1727204128.33615: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204128.33619: Calling groups_plugins_play to load vars for managed-node2 16142 1727204128.34635: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000006c7 16142 1727204128.34639: WORKER PROCESS EXITING 16142 1727204128.35524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204128.37308: done with get_vars() 16142 1727204128.37333: done getting variables 16142 1727204128.37406: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:55:28 -0400 (0:00:00.069) 0:00:27.551 ***** 16142 1727204128.37449: entering _queue_task() for managed-node2/set_fact 16142 1727204128.37831: worker is 1 (out of 1 available) 16142 1727204128.37846: exiting _queue_task() for managed-node2/set_fact 16142 1727204128.37859: done queuing things up, now waiting for results queue to drain 16142 1727204128.37860: waiting for pending results... 
16142 1727204128.38193: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 16142 1727204128.38389: in run() - task 0affcd87-79f5-fddd-f6c7-0000000006c8 16142 1727204128.38424: variable 'ansible_search_path' from source: unknown 16142 1727204128.38434: variable 'ansible_search_path' from source: unknown 16142 1727204128.38488: calling self._execute() 16142 1727204128.38590: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204128.38601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204128.38615: variable 'omit' from source: magic vars 16142 1727204128.39730: variable 'ansible_distribution_major_version' from source: facts 16142 1727204128.39749: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204128.39954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204128.40276: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204128.40328: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204128.40385: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204128.40425: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204128.40533: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204128.40578: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204128.40610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204128.40643: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204128.40753: variable '__network_is_ostree' from source: set_fact 16142 1727204128.40769: Evaluated conditional (not __network_is_ostree is defined): False 16142 1727204128.40787: when evaluation is False, skipping this task 16142 1727204128.40799: _execute() done 16142 1727204128.40808: dumping result to json 16142 1727204128.40819: done dumping result, returning 16142 1727204128.40832: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-fddd-f6c7-0000000006c8] 16142 1727204128.40843: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000006c8 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 16142 1727204128.41000: no more pending results, returning what we have 16142 1727204128.41005: results queue empty 16142 1727204128.41006: checking for any_errors_fatal 16142 1727204128.41013: done checking for any_errors_fatal 16142 1727204128.41014: checking for max_fail_percentage 16142 1727204128.41016: done checking for max_fail_percentage 16142 1727204128.41017: checking to see 
if all hosts have failed and the running result is not ok 16142 1727204128.41018: done checking to see if all hosts have failed 16142 1727204128.41019: getting the remaining hosts for this loop 16142 1727204128.41021: done getting the remaining hosts for this loop 16142 1727204128.41025: getting the next task for host managed-node2 16142 1727204128.41037: done getting next task for host managed-node2 16142 1727204128.41042: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 16142 1727204128.41047: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204128.41078: getting variables 16142 1727204128.41081: in VariableManager get_vars() 16142 1727204128.41142: Calling all_inventory to load vars for managed-node2 16142 1727204128.41145: Calling groups_inventory to load vars for managed-node2 16142 1727204128.41147: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204128.41159: Calling all_plugins_play to load vars for managed-node2 16142 1727204128.41162: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204128.41167: Calling groups_plugins_play to load vars for managed-node2 16142 1727204128.42170: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000006c8 16142 1727204128.42174: WORKER PROCESS EXITING 16142 1727204128.43413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204128.44352: done with get_vars() 16142 1727204128.44375: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:55:28 -0400 (0:00:00.070) 0:00:27.621 ***** 16142 1727204128.44453: entering _queue_task() for managed-node2/service_facts 16142 1727204128.44738: worker is 1 (out of 1 available) 16142 1727204128.44752: exiting _queue_task() for managed-node2/service_facts 16142 1727204128.45061: done queuing things up, now waiting for results queue to drain 16142 1727204128.45063: waiting for pending results... 
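Both ostree-related tasks above were skipped for the same reason: __network_is_ostree was already set by an earlier set_fact in this run, so the "not __network_is_ostree is defined" guard fails and neither the probe nor the flag assignment is repeated. A sketch of that detect-once pattern follows; only the task names and the guard appear in the log, so the stat path, register name and fact value are assumptions.

```yaml
# Detect-once pattern sketched from the two skipped tasks above.
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted        # assumed path, not shown in this log
  register: __ostree_booted_stat    # assumed register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

With the flag already present, the run moves straight on to the "Check which services are running" task below, which ships the service_facts module (AnsiballZ_service_facts.py in the transfer that follows) to populate ansible_facts.services for the role.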
16142 1727204128.45088: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 16142 1727204128.45217: in run() - task 0affcd87-79f5-fddd-f6c7-0000000006ca 16142 1727204128.45248: variable 'ansible_search_path' from source: unknown 16142 1727204128.45258: variable 'ansible_search_path' from source: unknown 16142 1727204128.45306: calling self._execute() 16142 1727204128.45419: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204128.45431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204128.45455: variable 'omit' from source: magic vars 16142 1727204128.46098: variable 'ansible_distribution_major_version' from source: facts 16142 1727204128.46118: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204128.46131: variable 'omit' from source: magic vars 16142 1727204128.46248: variable 'omit' from source: magic vars 16142 1727204128.46298: variable 'omit' from source: magic vars 16142 1727204128.46347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204128.46398: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204128.46432: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204128.46455: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204128.46474: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204128.46516: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204128.46524: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204128.46533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204128.46647: Set connection var ansible_timeout to 10 16142 1727204128.46655: Set connection var ansible_connection to ssh 16142 1727204128.46666: Set connection var ansible_shell_type to sh 16142 1727204128.46675: Set connection var ansible_shell_executable to /bin/sh 16142 1727204128.46684: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204128.46701: Set connection var ansible_pipelining to False 16142 1727204128.46727: variable 'ansible_shell_executable' from source: unknown 16142 1727204128.46735: variable 'ansible_connection' from source: unknown 16142 1727204128.46742: variable 'ansible_module_compression' from source: unknown 16142 1727204128.46748: variable 'ansible_shell_type' from source: unknown 16142 1727204128.46754: variable 'ansible_shell_executable' from source: unknown 16142 1727204128.46760: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204128.46770: variable 'ansible_pipelining' from source: unknown 16142 1727204128.46777: variable 'ansible_timeout' from source: unknown 16142 1727204128.46784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204128.47056: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204128.47079: variable 'omit' from source: magic vars 16142 
1727204128.47083: starting attempt loop 16142 1727204128.47086: running the handler 16142 1727204128.47098: _low_level_execute_command(): starting 16142 1727204128.47105: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204128.47639: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204128.47649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204128.47681: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204128.47695: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204128.47706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204128.47757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204128.47768: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204128.47828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204128.49610: stdout chunk (state=3): >>>/root <<< 16142 1727204128.49951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204128.49955: stdout chunk (state=3): >>><<< 16142 1727204128.49962: stderr chunk (state=3): >>><<< 16142 1727204128.49986: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204128.50001: _low_level_execute_command(): starting 16142 1727204128.50007: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204128.49986-18225-114385514420980 `" && echo 
ansible-tmp-1727204128.49986-18225-114385514420980="` echo /root/.ansible/tmp/ansible-tmp-1727204128.49986-18225-114385514420980 `" ) && sleep 0' 16142 1727204128.50903: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204128.50911: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204128.50927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204128.50941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204128.51042: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204128.51056: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204128.51059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204128.51090: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204128.51093: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204128.51096: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204128.51127: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204128.51137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204128.51140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204128.51253: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204128.51277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204128.51348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204128.53250: stdout chunk (state=3): >>>ansible-tmp-1727204128.49986-18225-114385514420980=/root/.ansible/tmp/ansible-tmp-1727204128.49986-18225-114385514420980 <<< 16142 1727204128.53391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204128.53451: stderr chunk (state=3): >>><<< 16142 1727204128.53454: stdout chunk (state=3): >>><<< 16142 1727204128.53466: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204128.49986-18225-114385514420980=/root/.ansible/tmp/ansible-tmp-1727204128.49986-18225-114385514420980 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204128.53778: variable 'ansible_module_compression' from source: unknown 16142 1727204128.53781: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 16142 1727204128.53784: variable 'ansible_facts' from source: unknown 16142 1727204128.53786: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204128.49986-18225-114385514420980/AnsiballZ_service_facts.py 16142 1727204128.53939: Sending initial data 16142 1727204128.53948: Sent initial data (160 bytes) 16142 1727204128.55092: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204128.55109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204128.55123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204128.55145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204128.55199: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204128.55217: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204128.55231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204128.55251: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204128.55263: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204128.55275: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204128.55294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204128.55311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204128.55337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204128.55350: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204128.55362: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204128.55382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204128.55449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204128.55453: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204128.55508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204128.57226: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server 
supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204128.57271: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204128.57299: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpokbv13qo /root/.ansible/tmp/ansible-tmp-1727204128.49986-18225-114385514420980/AnsiballZ_service_facts.py <<< 16142 1727204128.57349: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204128.58330: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204128.58447: stderr chunk (state=3): >>><<< 16142 1727204128.58451: stdout chunk (state=3): >>><<< 16142 1727204128.58468: done transferring module to remote 16142 1727204128.58477: _low_level_execute_command(): starting 16142 1727204128.58483: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204128.49986-18225-114385514420980/ /root/.ansible/tmp/ansible-tmp-1727204128.49986-18225-114385514420980/AnsiballZ_service_facts.py && sleep 0' 16142 1727204128.58950: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204128.58960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204128.58968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204128.58980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204128.59010: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204128.59018: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204128.59026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204128.59041: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204128.59050: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204128.59055: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204128.59060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204128.59073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204128.59078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204128.59083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204128.59132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204128.59156: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204128.59162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204128.59213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204128.60926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204128.60991: stderr chunk (state=3): >>><<< 16142 1727204128.60994: stdout chunk (state=3): >>><<< 16142 1727204128.61010: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204128.61013: _low_level_execute_command(): starting 16142 1727204128.61019: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204128.49986-18225-114385514420980/AnsiballZ_service_facts.py && sleep 0' 16142 1727204128.61770: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204128.61780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204128.61791: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204128.61805: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204128.61809: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204128.61816: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204128.61821: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204128.61830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204128.61840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204128.61846: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204128.61920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204128.61922: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204128.61924: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204128.61962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204129.91842: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", <<< 16142 1727204129.91881: stdout chunk (state=3): >>>"source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": 
"kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 16142 1727204129.91907: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"syst<<< 16142 1727204129.91917: stdout chunk (state=3): >>>emd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": 
"enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 16142 1727204129.93230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 16142 1727204129.93298: stderr chunk (state=3): >>><<< 16142 1727204129.93303: stdout chunk (state=3): >>><<< 16142 1727204129.93331: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": 
"active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": 
"systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": 
"alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": 
"systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
16142 1727204129.93748: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204128.49986-18225-114385514420980/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204129.93762: _low_level_execute_command(): starting 16142 1727204129.93767: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204128.49986-18225-114385514420980/ > /dev/null 2>&1 && sleep 0' 16142 1727204129.94408: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204129.94418: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204129.94429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204129.94443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204129.94486: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204129.94497: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204129.94500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204129.94515: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204129.94522: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204129.94528: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204129.94538: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204129.94545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204129.94557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204129.94566: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204129.94574: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204129.94583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204129.94659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204129.94676: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204129.94689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204129.94759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204129.96667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204129.96673: stdout chunk (state=3): >>><<< 16142 1727204129.96675: stderr chunk (state=3): >>><<< 16142 1727204129.96999: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204129.97005: handler run complete 16142 1727204129.97008: variable 'ansible_facts' from source: unknown 16142 1727204129.97118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204129.97606: variable 'ansible_facts' from source: unknown 16142 1727204129.97755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204129.97960: attempt loop complete, returning result 16142 1727204129.97974: _execute() done 16142 1727204129.97981: dumping result to json 16142 1727204129.98045: done dumping result, returning 16142 1727204129.98071: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-fddd-f6c7-0000000006ca] 16142 1727204129.98082: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000006ca ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204129.98937: no more pending results, returning what we have 16142 1727204129.98940: results queue empty 16142 1727204129.98941: checking for any_errors_fatal 16142 1727204129.98947: done checking for any_errors_fatal 16142 1727204129.98948: checking for max_fail_percentage 16142 1727204129.98950: done checking for max_fail_percentage 16142 1727204129.98951: checking to see if all hosts have failed and the running result is not ok 16142 1727204129.98952: done checking to see if all hosts have failed 16142 1727204129.98953: getting the remaining hosts for this loop 16142 1727204129.98954: done getting the remaining hosts for this loop 16142 1727204129.98958: getting the next task for host managed-node2 16142 1727204129.98967: done getting next task for host managed-node2 16142 1727204129.98971: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 16142 1727204129.98976: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204129.98991: getting variables 16142 1727204129.98993: in VariableManager get_vars() 16142 1727204129.99051: Calling all_inventory to load vars for managed-node2 16142 1727204129.99054: Calling groups_inventory to load vars for managed-node2 16142 1727204129.99057: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204129.99069: Calling all_plugins_play to load vars for managed-node2 16142 1727204129.99071: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204129.99074: Calling groups_plugins_play to load vars for managed-node2 16142 1727204130.00154: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000006ca 16142 1727204130.00162: WORKER PROCESS EXITING 16142 1727204130.01132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204130.02888: done with get_vars() 16142 1727204130.02922: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:55:30 -0400 (0:00:01.585) 0:00:29.207 ***** 16142 1727204130.03044: entering _queue_task() for managed-node2/package_facts 16142 1727204130.03414: worker is 1 (out of 1 available) 16142 1727204130.03436: exiting _queue_task() for managed-node2/package_facts 16142 1727204130.03451: done queuing things up, now waiting for results queue to drain 16142 1727204130.03452: waiting for pending results... 
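Each task transition in this trace is bracketed by a banner of the form TASK [...] followed by the task path, a wall-clock timestamp, the previous task's duration in parentheses and the cumulative playbook runtime (here 0:00:01.585 and 0:00:29.207). A small, hedged sketch for pulling those fields out of the flattened banner text in this log; the regex is fitted to the sample shown here, not an Ansible interface:

    import re

    # Assumed pattern for the flattened task banners in this trace.
    BANNER = re.compile(
        r"TASK \[(?P<task>[^\]]+)\].*?"
        r"\((?P<prev>\d+:\d{2}:\d{2}\.\d+)\)\s+(?P<total>\d+:\d{2}:\d{2}\.\d+)"
    )

    line = ("TASK [fedora.linux_system_roles.network : Check which packages are installed] *** "
            "task path: .../set_facts.yml:26 Tuesday 24 September 2024 14:55:30 -0400 "
            "(0:00:01.585) 0:00:29.207 *****")

    m = BANNER.search(line)
    if m:
        print(m.group("task"))   # role task name
        print(m.group("prev"))   # 0:00:01.585 - time spent in the previous task
        print(m.group("total"))  # 0:00:29.207 - elapsed playbook time so far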
16142 1727204130.03781: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 16142 1727204130.03973: in run() - task 0affcd87-79f5-fddd-f6c7-0000000006cb 16142 1727204130.04000: variable 'ansible_search_path' from source: unknown 16142 1727204130.04014: variable 'ansible_search_path' from source: unknown 16142 1727204130.04059: calling self._execute() 16142 1727204130.04176: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204130.04188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204130.04210: variable 'omit' from source: magic vars 16142 1727204130.04648: variable 'ansible_distribution_major_version' from source: facts 16142 1727204130.04673: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204130.04684: variable 'omit' from source: magic vars 16142 1727204130.04780: variable 'omit' from source: magic vars 16142 1727204130.04818: variable 'omit' from source: magic vars 16142 1727204130.04871: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204130.04917: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204130.04948: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204130.04977: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204130.04997: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204130.05039: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204130.05043: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204130.05045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204130.05123: Set connection var ansible_timeout to 10 16142 1727204130.05126: Set connection var ansible_connection to ssh 16142 1727204130.05129: Set connection var ansible_shell_type to sh 16142 1727204130.05137: Set connection var ansible_shell_executable to /bin/sh 16142 1727204130.05140: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204130.05146: Set connection var ansible_pipelining to False 16142 1727204130.05165: variable 'ansible_shell_executable' from source: unknown 16142 1727204130.05168: variable 'ansible_connection' from source: unknown 16142 1727204130.05171: variable 'ansible_module_compression' from source: unknown 16142 1727204130.05173: variable 'ansible_shell_type' from source: unknown 16142 1727204130.05177: variable 'ansible_shell_executable' from source: unknown 16142 1727204130.05179: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204130.05182: variable 'ansible_pipelining' from source: unknown 16142 1727204130.05184: variable 'ansible_timeout' from source: unknown 16142 1727204130.05283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204130.05415: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204130.05425: variable 'omit' from source: magic vars 16142 
1727204130.05431: starting attempt loop 16142 1727204130.05436: running the handler 16142 1727204130.05447: _low_level_execute_command(): starting 16142 1727204130.05455: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204130.06199: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204130.06212: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204130.06224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204130.06240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204130.06278: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204130.06286: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204130.06297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204130.06311: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204130.06320: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204130.06327: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204130.06338: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204130.06345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204130.06358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204130.06372: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204130.06375: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204130.06387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204130.06476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204130.06491: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204130.06493: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204130.06614: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204130.08146: stdout chunk (state=3): >>>/root <<< 16142 1727204130.08283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204130.08329: stderr chunk (state=3): >>><<< 16142 1727204130.08332: stdout chunk (state=3): >>><<< 16142 1727204130.08355: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204130.08371: _low_level_execute_command(): starting 16142 1727204130.08377: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204130.0835552-18427-102063539526393 `" && echo ansible-tmp-1727204130.0835552-18427-102063539526393="` echo /root/.ansible/tmp/ansible-tmp-1727204130.0835552-18427-102063539526393 `" ) && sleep 0' 16142 1727204130.08896: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204130.08901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204130.08949: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204130.08953: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204130.08955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204130.09002: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204130.09009: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204130.09016: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204130.09075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204130.10899: stdout chunk (state=3): >>>ansible-tmp-1727204130.0835552-18427-102063539526393=/root/.ansible/tmp/ansible-tmp-1727204130.0835552-18427-102063539526393 <<< 16142 1727204130.11016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204130.11075: stderr chunk (state=3): >>><<< 16142 1727204130.11078: stdout chunk (state=3): >>><<< 16142 1727204130.11093: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204130.0835552-18427-102063539526393=/root/.ansible/tmp/ansible-tmp-1727204130.0835552-18427-102063539526393 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204130.11132: variable 'ansible_module_compression' from source: unknown 16142 1727204130.11179: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 16142 1727204130.11230: variable 'ansible_facts' from source: unknown 16142 1727204130.11368: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204130.0835552-18427-102063539526393/AnsiballZ_package_facts.py 16142 1727204130.11490: Sending initial data 16142 1727204130.11494: Sent initial data (162 bytes) 16142 1727204130.12192: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204130.12195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204130.12230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204130.12233: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204130.12235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204130.12295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204130.12299: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204130.12303: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204130.12340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204130.14030: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204130.14061: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204130.14097: 
stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpvbgu349o /root/.ansible/tmp/ansible-tmp-1727204130.0835552-18427-102063539526393/AnsiballZ_package_facts.py <<< 16142 1727204130.14130: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204130.16101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204130.16368: stderr chunk (state=3): >>><<< 16142 1727204130.16371: stdout chunk (state=3): >>><<< 16142 1727204130.16373: done transferring module to remote 16142 1727204130.16375: _low_level_execute_command(): starting 16142 1727204130.16377: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204130.0835552-18427-102063539526393/ /root/.ansible/tmp/ansible-tmp-1727204130.0835552-18427-102063539526393/AnsiballZ_package_facts.py && sleep 0' 16142 1727204130.16882: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204130.16886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204130.16942: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204130.16946: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204130.16953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204130.16955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204130.17011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204130.17014: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204130.17020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204130.17061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204130.18785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204130.18842: stderr chunk (state=3): >>><<< 16142 1727204130.18846: stdout chunk (state=3): >>><<< 16142 1727204130.18862: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204130.18867: _low_level_execute_command(): starting 16142 1727204130.18871: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204130.0835552-18427-102063539526393/AnsiballZ_package_facts.py && sleep 0' 16142 1727204130.19345: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204130.19348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204130.19358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204130.19393: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204130.19407: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204130.19466: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204130.19470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204130.19538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204130.66056: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": 
"10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{<<< 16142 1727204130.66148: stdout chunk (state=3): >>>"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": 
[{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4"<<< 16142 1727204130.66155: stdout chunk (state=3): >>>, "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", 
"release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x<<< 16142 1727204130.66179: stdout chunk (state=3): >>>86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "rel<<< 16142 1727204130.66186: stdout chunk (state=3): >>>ease": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": 
[{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 16142 1727204130.66189: stdout chunk (state=3): >>>libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", 
"version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "rel<<< 16142 1727204130.66192: stdout chunk (state=3): >>>ease": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": 
"3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], <<< 16142 1727204130.66195: stdout chunk (state=3): >>>"slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "r<<< 16142 1727204130.66198: stdout chunk (state=3): >>>elease": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles"<<< 16142 1727204130.66200: stdout chunk (state=3): >>>: [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", 
"version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pe<<< 16142 1727204130.66204: stdout chunk (state=3): >>>rl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}],<<< 16142 1727204130.66207: stdout chunk (state=3): >>> "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": 
"461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": 
"glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", 
"version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el<<< 16142 1727204130.66213: stdout chunk (state=3): >>>9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": 
"5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 16142 1727204130.67797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 16142 1727204130.67801: stderr chunk (state=3): >>><<< 16142 1727204130.67803: stdout chunk (state=3): >>><<< 16142 1727204130.68087: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": 
"10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", 
"release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": 
[{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", 
"version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 
1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": 
"userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": 
"2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": 
"146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", 
"version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", 
"version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 16142 1727204130.70632: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204130.0835552-18427-102063539526393/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204130.70693: _low_level_execute_command(): starting 16142 1727204130.70705: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204130.0835552-18427-102063539526393/ > /dev/null 2>&1 && sleep 0' 16142 1727204130.71668: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204130.71687: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204130.71704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204130.71729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204130.71789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204130.71803: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204130.71817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204130.71832: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204130.71850: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204130.71860: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204130.71872: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204130.71883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204130.71896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204130.71906: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204130.71914: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204130.71924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204130.72009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204130.72031: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204130.72047: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204130.72114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204130.73994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204130.74095: stderr chunk (state=3): >>><<< 16142 1727204130.74099: stdout chunk (state=3): >>><<< 16142 1727204130.74274: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204130.74279: handler run complete 16142 1727204130.75748: variable 'ansible_facts' from source: unknown 16142 1727204130.76960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204130.80541: variable 'ansible_facts' from source: unknown 16142 1727204130.88031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204130.88987: attempt loop complete, returning result 16142 1727204130.89014: _execute() done 16142 1727204130.89028: dumping result to json 16142 1727204130.89290: done dumping result, returning 16142 1727204130.89305: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check 
which packages are installed [0affcd87-79f5-fddd-f6c7-0000000006cb] 16142 1727204130.89315: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000006cb ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204130.91608: no more pending results, returning what we have 16142 1727204130.91613: results queue empty 16142 1727204130.91615: checking for any_errors_fatal 16142 1727204130.91623: done checking for any_errors_fatal 16142 1727204130.91624: checking for max_fail_percentage 16142 1727204130.91625: done checking for max_fail_percentage 16142 1727204130.91626: checking to see if all hosts have failed and the running result is not ok 16142 1727204130.91627: done checking to see if all hosts have failed 16142 1727204130.91628: getting the remaining hosts for this loop 16142 1727204130.91629: done getting the remaining hosts for this loop 16142 1727204130.91635: getting the next task for host managed-node2 16142 1727204130.91644: done getting next task for host managed-node2 16142 1727204130.91648: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 16142 1727204130.91651: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204130.91662: getting variables 16142 1727204130.91665: in VariableManager get_vars() 16142 1727204130.91713: Calling all_inventory to load vars for managed-node2 16142 1727204130.91716: Calling groups_inventory to load vars for managed-node2 16142 1727204130.91718: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204130.91728: Calling all_plugins_play to load vars for managed-node2 16142 1727204130.91731: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204130.91737: Calling groups_plugins_play to load vars for managed-node2 16142 1727204130.92752: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000006cb 16142 1727204130.92757: WORKER PROCESS EXITING 16142 1727204131.02846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204131.06383: done with get_vars() 16142 1727204131.06604: done getting variables 16142 1727204131.06657: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:55:31 -0400 (0:00:01.036) 0:00:30.243 ***** 16142 1727204131.06691: entering _queue_task() for managed-node2/debug 16142 1727204131.07487: worker is 1 (out of 1 available) 16142 1727204131.07501: exiting _queue_task() for managed-node2/debug 16142 1727204131.07513: done queuing things up, now waiting for results queue to drain 16142 1727204131.07515: waiting for pending results... 
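The banner above announces the task at roles/network/tasks/main.yml:7, run through the 'debug' action plugin. The task file itself is not reproduced in this log, so the following is only a sketch of what that task plausibly looks like, reconstructed from the task name, the 'debug' action, the 'network_provider' set_fact variable referenced in the trace, and the "Using network provider: nm" message in the result further below; anything beyond those details is an assumption.

    # Hypothetical reconstruction of roles/network/tasks/main.yml:7 (not taken from the role source)
    - name: Print network provider
      debug:
        msg: "Using network provider: {{ network_provider }}"

The result recorded below ("MSG: Using network provider: nm") is exactly what such a task would emit when network_provider has been set to "nm" earlier in the run.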
16142 1727204131.08308: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 16142 1727204131.08613: in run() - task 0affcd87-79f5-fddd-f6c7-00000000007c 16142 1727204131.09193: variable 'ansible_search_path' from source: unknown 16142 1727204131.09204: variable 'ansible_search_path' from source: unknown 16142 1727204131.09255: calling self._execute() 16142 1727204131.09675: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204131.09691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204131.09711: variable 'omit' from source: magic vars 16142 1727204131.10737: variable 'ansible_distribution_major_version' from source: facts 16142 1727204131.10759: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204131.10883: variable 'omit' from source: magic vars 16142 1727204131.10958: variable 'omit' from source: magic vars 16142 1727204131.11382: variable 'network_provider' from source: set_fact 16142 1727204131.11408: variable 'omit' from source: magic vars 16142 1727204131.11459: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204131.11606: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204131.11638: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204131.11661: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204131.11763: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204131.11804: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204131.11978: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204131.11988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204131.12105: Set connection var ansible_timeout to 10 16142 1727204131.12112: Set connection var ansible_connection to ssh 16142 1727204131.12122: Set connection var ansible_shell_type to sh 16142 1727204131.12133: Set connection var ansible_shell_executable to /bin/sh 16142 1727204131.12148: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204131.12160: Set connection var ansible_pipelining to False 16142 1727204131.12203: variable 'ansible_shell_executable' from source: unknown 16142 1727204131.12377: variable 'ansible_connection' from source: unknown 16142 1727204131.12385: variable 'ansible_module_compression' from source: unknown 16142 1727204131.12395: variable 'ansible_shell_type' from source: unknown 16142 1727204131.12402: variable 'ansible_shell_executable' from source: unknown 16142 1727204131.12409: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204131.12419: variable 'ansible_pipelining' from source: unknown 16142 1727204131.12427: variable 'ansible_timeout' from source: unknown 16142 1727204131.12438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204131.12793: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 16142 1727204131.12810: variable 'omit' from source: magic vars 16142 1727204131.12822: starting attempt loop 16142 1727204131.12830: running the handler 16142 1727204131.12883: handler run complete 16142 1727204131.12904: attempt loop complete, returning result 16142 1727204131.12912: _execute() done 16142 1727204131.12918: dumping result to json 16142 1727204131.12925: done dumping result, returning 16142 1727204131.12940: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-fddd-f6c7-00000000007c] 16142 1727204131.12953: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000007c ok: [managed-node2] => {} MSG: Using network provider: nm 16142 1727204131.13126: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000007c 16142 1727204131.13130: WORKER PROCESS EXITING 16142 1727204131.13197: no more pending results, returning what we have 16142 1727204131.13200: results queue empty 16142 1727204131.13201: checking for any_errors_fatal 16142 1727204131.13213: done checking for any_errors_fatal 16142 1727204131.13213: checking for max_fail_percentage 16142 1727204131.13215: done checking for max_fail_percentage 16142 1727204131.13216: checking to see if all hosts have failed and the running result is not ok 16142 1727204131.13217: done checking to see if all hosts have failed 16142 1727204131.13217: getting the remaining hosts for this loop 16142 1727204131.13219: done getting the remaining hosts for this loop 16142 1727204131.13223: getting the next task for host managed-node2 16142 1727204131.13231: done getting next task for host managed-node2 16142 1727204131.13235: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 16142 1727204131.13238: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204131.13250: getting variables 16142 1727204131.13252: in VariableManager get_vars() 16142 1727204131.13306: Calling all_inventory to load vars for managed-node2 16142 1727204131.13308: Calling groups_inventory to load vars for managed-node2 16142 1727204131.13310: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204131.13320: Calling all_plugins_play to load vars for managed-node2 16142 1727204131.13322: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204131.13325: Calling groups_plugins_play to load vars for managed-node2 16142 1727204131.15907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204131.18167: done with get_vars() 16142 1727204131.18201: done getting variables 16142 1727204131.18386: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:55:31 -0400 (0:00:00.117) 0:00:30.361 ***** 16142 1727204131.18542: entering _queue_task() for managed-node2/fail 16142 1727204131.19384: worker is 1 (out of 1 available) 16142 1727204131.19396: exiting _queue_task() for managed-node2/fail 16142 1727204131.19477: done queuing things up, now waiting for results queue to drain 16142 1727204131.19479: waiting for pending results... 
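The task announced above (roles/network/tasks/main.yml:11) is loaded through the 'fail' action plugin and, as the skip result below shows, never fires because the first condition evaluated, network_state != {}, is false. A hedged sketch of such a guard task follows; the name and the when expression come from the log, while the failure message and any further conditions (the task name suggests an initscripts-provider check, which would never be reached once the first condition is false) are assumptions.

    # Hypothetical sketch; only "network_state != {}" is confirmed by the log
    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      fail:
        msg: Applying the network state configuration is not supported with the initscripts provider  # assumed wording
      when:
        - network_state != {}
        # an additional provider check is implied by the task name but is not visible in the log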
16142 1727204131.19846: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 16142 1727204131.19994: in run() - task 0affcd87-79f5-fddd-f6c7-00000000007d 16142 1727204131.20008: variable 'ansible_search_path' from source: unknown 16142 1727204131.20014: variable 'ansible_search_path' from source: unknown 16142 1727204131.20058: calling self._execute() 16142 1727204131.20166: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204131.20174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204131.20185: variable 'omit' from source: magic vars 16142 1727204131.20589: variable 'ansible_distribution_major_version' from source: facts 16142 1727204131.20602: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204131.20738: variable 'network_state' from source: role '' defaults 16142 1727204131.20746: Evaluated conditional (network_state != {}): False 16142 1727204131.20749: when evaluation is False, skipping this task 16142 1727204131.20752: _execute() done 16142 1727204131.20755: dumping result to json 16142 1727204131.20758: done dumping result, returning 16142 1727204131.20768: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-fddd-f6c7-00000000007d] 16142 1727204131.20775: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000007d 16142 1727204131.20877: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000007d 16142 1727204131.20881: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204131.20940: no more pending results, returning what we have 16142 1727204131.20946: results queue empty 16142 1727204131.20947: checking for any_errors_fatal 16142 1727204131.20954: done checking for any_errors_fatal 16142 1727204131.20955: checking for max_fail_percentage 16142 1727204131.20957: done checking for max_fail_percentage 16142 1727204131.20958: checking to see if all hosts have failed and the running result is not ok 16142 1727204131.20959: done checking to see if all hosts have failed 16142 1727204131.20959: getting the remaining hosts for this loop 16142 1727204131.20961: done getting the remaining hosts for this loop 16142 1727204131.20967: getting the next task for host managed-node2 16142 1727204131.20976: done getting next task for host managed-node2 16142 1727204131.20981: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 16142 1727204131.20985: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204131.21006: getting variables 16142 1727204131.21008: in VariableManager get_vars() 16142 1727204131.21068: Calling all_inventory to load vars for managed-node2 16142 1727204131.21072: Calling groups_inventory to load vars for managed-node2 16142 1727204131.21074: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204131.21086: Calling all_plugins_play to load vars for managed-node2 16142 1727204131.21088: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204131.21091: Calling groups_plugins_play to load vars for managed-node2 16142 1727204131.24997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204131.28630: done with get_vars() 16142 1727204131.28670: done getting variables 16142 1727204131.28857: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:55:31 -0400 (0:00:00.104) 0:00:30.465 ***** 16142 1727204131.28899: entering _queue_task() for managed-node2/fail 16142 1727204131.29821: worker is 1 (out of 1 available) 16142 1727204131.29833: exiting _queue_task() for managed-node2/fail 16142 1727204131.29845: done queuing things up, now waiting for results queue to drain 16142 1727204131.29846: waiting for pending results... 
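The next guard (roles/network/tasks/main.yml:18) follows the same pattern: a 'fail' task whose first condition, network_state != {}, evaluates to false, so it is skipped before any version check can run. A minimal sketch under the same assumptions:

    # Hypothetical sketch; the version check is implied by the task name but not shown in the log
    - name: Abort applying the network state configuration if the system version of the managed host is below 8
      fail:
        msg: Applying the network state configuration requires a managed host of version 8 or later  # assumed wording
      when:
        - network_state != {}
        # an "ansible_distribution_major_version | int < 8"-style check is assumed from the task name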
16142 1727204131.30651: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 16142 1727204131.31076: in run() - task 0affcd87-79f5-fddd-f6c7-00000000007e 16142 1727204131.31098: variable 'ansible_search_path' from source: unknown 16142 1727204131.31113: variable 'ansible_search_path' from source: unknown 16142 1727204131.31165: calling self._execute() 16142 1727204131.31491: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204131.31504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204131.31518: variable 'omit' from source: magic vars 16142 1727204131.32270: variable 'ansible_distribution_major_version' from source: facts 16142 1727204131.32387: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204131.32647: variable 'network_state' from source: role '' defaults 16142 1727204131.32758: Evaluated conditional (network_state != {}): False 16142 1727204131.32769: when evaluation is False, skipping this task 16142 1727204131.32777: _execute() done 16142 1727204131.32784: dumping result to json 16142 1727204131.32792: done dumping result, returning 16142 1727204131.32805: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-fddd-f6c7-00000000007e] 16142 1727204131.32819: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000007e 16142 1727204131.32943: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000007e 16142 1727204131.32949: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204131.33005: no more pending results, returning what we have 16142 1727204131.33009: results queue empty 16142 1727204131.33010: checking for any_errors_fatal 16142 1727204131.33016: done checking for any_errors_fatal 16142 1727204131.33017: checking for max_fail_percentage 16142 1727204131.33019: done checking for max_fail_percentage 16142 1727204131.33020: checking to see if all hosts have failed and the running result is not ok 16142 1727204131.33021: done checking to see if all hosts have failed 16142 1727204131.33021: getting the remaining hosts for this loop 16142 1727204131.33022: done getting the remaining hosts for this loop 16142 1727204131.33026: getting the next task for host managed-node2 16142 1727204131.33033: done getting next task for host managed-node2 16142 1727204131.33038: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 16142 1727204131.33041: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204131.33060: getting variables 16142 1727204131.33064: in VariableManager get_vars() 16142 1727204131.33121: Calling all_inventory to load vars for managed-node2 16142 1727204131.33124: Calling groups_inventory to load vars for managed-node2 16142 1727204131.33127: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204131.33138: Calling all_plugins_play to load vars for managed-node2 16142 1727204131.33140: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204131.33143: Calling groups_plugins_play to load vars for managed-node2 16142 1727204131.35521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204131.39445: done with get_vars() 16142 1727204131.39515: done getting variables 16142 1727204131.39694: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:55:31 -0400 (0:00:00.108) 0:00:30.574 ***** 16142 1727204131.39733: entering _queue_task() for managed-node2/fail 16142 1727204131.40494: worker is 1 (out of 1 available) 16142 1727204131.40507: exiting _queue_task() for managed-node2/fail 16142 1727204131.40520: done queuing things up, now waiting for results queue to drain 16142 1727204131.40521: waiting for pending results... 
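For the teaming guard at roles/network/tasks/main.yml:25, the log does expose the deciding expression: the skip result further below records false_condition "ansible_distribution_major_version | int > 9". A sketch built from that, with the failure message and any check for actually-defined team connections assumed:

    # Hypothetical sketch; the when expression is the one reported in the skip result
    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      fail:
        msg: Teaming is not supported on EL10 or later  # assumed wording
      when:
        - ansible_distribution_major_version | int > 9
        # a check that team connections are actually defined is assumed from the task name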
16142 1727204131.41843: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 16142 1727204131.42206: in run() - task 0affcd87-79f5-fddd-f6c7-00000000007f 16142 1727204131.42489: variable 'ansible_search_path' from source: unknown 16142 1727204131.42498: variable 'ansible_search_path' from source: unknown 16142 1727204131.42545: calling self._execute() 16142 1727204131.42660: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204131.42781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204131.42796: variable 'omit' from source: magic vars 16142 1727204131.43587: variable 'ansible_distribution_major_version' from source: facts 16142 1727204131.43609: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204131.44002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204131.49194: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204131.49291: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204131.49348: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204131.49393: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204131.49430: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204131.49525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204131.49572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204131.49606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204131.49661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204131.49686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204131.49796: variable 'ansible_distribution_major_version' from source: facts 16142 1727204131.49819: Evaluated conditional (ansible_distribution_major_version | int > 9): False 16142 1727204131.49827: when evaluation is False, skipping this task 16142 1727204131.49837: _execute() done 16142 1727204131.49845: dumping result to json 16142 1727204131.49856: done dumping result, returning 16142 1727204131.49874: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-fddd-f6c7-00000000007f] 16142 1727204131.49885: sending task result for task 
0affcd87-79f5-fddd-f6c7-00000000007f skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 16142 1727204131.50041: no more pending results, returning what we have 16142 1727204131.50045: results queue empty 16142 1727204131.50047: checking for any_errors_fatal 16142 1727204131.50054: done checking for any_errors_fatal 16142 1727204131.50054: checking for max_fail_percentage 16142 1727204131.50056: done checking for max_fail_percentage 16142 1727204131.50057: checking to see if all hosts have failed and the running result is not ok 16142 1727204131.50058: done checking to see if all hosts have failed 16142 1727204131.50059: getting the remaining hosts for this loop 16142 1727204131.50060: done getting the remaining hosts for this loop 16142 1727204131.50066: getting the next task for host managed-node2 16142 1727204131.50073: done getting next task for host managed-node2 16142 1727204131.50077: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 16142 1727204131.50080: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204131.50100: getting variables 16142 1727204131.50102: in VariableManager get_vars() 16142 1727204131.50156: Calling all_inventory to load vars for managed-node2 16142 1727204131.50158: Calling groups_inventory to load vars for managed-node2 16142 1727204131.50160: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204131.50169: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000007f 16142 1727204131.50172: WORKER PROCESS EXITING 16142 1727204131.50183: Calling all_plugins_play to load vars for managed-node2 16142 1727204131.50186: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204131.50188: Calling groups_plugins_play to load vars for managed-node2 16142 1727204131.52887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204131.56610: done with get_vars() 16142 1727204131.56644: done getting variables 16142 1727204131.56830: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:55:31 -0400 (0:00:00.171) 0:00:30.745 ***** 16142 1727204131.56867: entering _queue_task() for managed-node2/dnf 16142 1727204131.57556: worker is 1 (out of 1 available) 16142 1727204131.57570: exiting _queue_task() for managed-node2/dnf 16142 1727204131.57695: done queuing things up, now waiting for results queue to drain 16142 1727204131.57697: waiting for pending results... 
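The DNF check queued above (roles/network/tasks/main.yml:36) uses the 'dnf' action plugin, and the trace below shows both of its evaluated conditions: the distribution check passes and the wireless/team check fails, so the task is skipped and its module arguments never appear in the log. The sketch below therefore takes both when expressions from the log and treats the dnf arguments purely as placeholders:

    # Hypothetical sketch; module arguments are assumed, only the when expressions are from the log
    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      dnf:
        name: "{{ network_packages }}"  # assumed argument
        state: latest                   # assumed argument
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined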
16142 1727204131.58612: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 16142 1727204131.58904: in run() - task 0affcd87-79f5-fddd-f6c7-000000000080 16142 1727204131.59015: variable 'ansible_search_path' from source: unknown 16142 1727204131.59024: variable 'ansible_search_path' from source: unknown 16142 1727204131.59074: calling self._execute() 16142 1727204131.59285: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204131.59297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204131.59311: variable 'omit' from source: magic vars 16142 1727204131.60215: variable 'ansible_distribution_major_version' from source: facts 16142 1727204131.60237: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204131.60626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204131.66299: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204131.66400: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204131.66543: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204131.66585: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204131.66735: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204131.66943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204131.66979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204131.67010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204131.67062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204131.67170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204131.67413: variable 'ansible_distribution' from source: facts 16142 1727204131.67424: variable 'ansible_distribution_major_version' from source: facts 16142 1727204131.67450: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 16142 1727204131.67729: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204131.67937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204131.67963: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204131.67990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204131.68038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204131.68050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204131.68109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204131.68121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204131.68149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204131.68194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204131.68209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204131.68254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204131.68285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204131.68312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204131.68355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204131.68370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204131.68538: variable 'network_connections' from source: task vars 16142 1727204131.68548: variable 'controller_profile' from source: play vars 16142 1727204131.68621: variable 'controller_profile' from source: play vars 16142 1727204131.68700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204131.68902: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204131.68942: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204131.68976: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204131.69008: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204131.69056: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204131.69080: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204131.69108: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204131.69136: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204131.69188: variable '__network_team_connections_defined' from source: role '' defaults 16142 1727204131.69403: variable 'network_connections' from source: task vars 16142 1727204131.69407: variable 'controller_profile' from source: play vars 16142 1727204131.69461: variable 'controller_profile' from source: play vars 16142 1727204131.69483: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 16142 1727204131.69487: when evaluation is False, skipping this task 16142 1727204131.69489: _execute() done 16142 1727204131.69492: dumping result to json 16142 1727204131.69494: done dumping result, returning 16142 1727204131.69511: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-fddd-f6c7-000000000080] 16142 1727204131.69528: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000080 16142 1727204131.69713: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000080 16142 1727204131.69716: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 16142 1727204131.70002: no more pending results, returning what we have 16142 1727204131.70006: results queue empty 16142 1727204131.70007: checking for any_errors_fatal 16142 1727204131.70013: done checking for any_errors_fatal 16142 1727204131.70014: checking for max_fail_percentage 16142 1727204131.70016: done checking for max_fail_percentage 16142 1727204131.70016: checking to see if all hosts have failed and the running result is not ok 16142 1727204131.70017: done checking to see if all hosts have failed 16142 1727204131.70018: getting the remaining hosts for this loop 16142 1727204131.70019: done getting the remaining hosts for this loop 16142 1727204131.70023: getting the next task for host managed-node2 16142 1727204131.70033: done getting next task for host managed-node2 16142 1727204131.70037: ^ task is: TASK: fedora.linux_system_roles.network : 
Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 16142 1727204131.70039: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204131.70055: getting variables 16142 1727204131.70057: in VariableManager get_vars() 16142 1727204131.70116: Calling all_inventory to load vars for managed-node2 16142 1727204131.70119: Calling groups_inventory to load vars for managed-node2 16142 1727204131.70121: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204131.70130: Calling all_plugins_play to load vars for managed-node2 16142 1727204131.70132: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204131.70135: Calling groups_plugins_play to load vars for managed-node2 16142 1727204131.71863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204131.74005: done with get_vars() 16142 1727204131.74032: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 16142 1727204131.74117: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:55:31 -0400 (0:00:00.172) 0:00:30.918 ***** 16142 1727204131.74154: entering _queue_task() for managed-node2/yum 16142 1727204131.74458: worker is 1 (out of 1 available) 16142 1727204131.74480: exiting _queue_task() for managed-node2/yum 16142 1727204131.74497: done queuing things up, now waiting for results queue to drain 16142 1727204131.74499: waiting for pending results... 
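The YUM counterpart at roles/network/tasks/main.yml:48 is queued above; the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line is the normal action-plugin redirect on DNF-based systems. The skip result below reports false_condition "ansible_distribution_major_version | int < 8", so only that expression is certain; the rest of this sketch is assumed:

    # Hypothetical sketch; only the version check is confirmed by the skip result
    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      yum:
        name: "{{ network_packages }}"  # assumed argument
        state: latest                   # assumed argument
      when:
        - ansible_distribution_major_version | int < 8
        # a wireless/team condition is implied by the task name but was never reached in this run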
16142 1727204131.74674: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 16142 1727204131.74767: in run() - task 0affcd87-79f5-fddd-f6c7-000000000081 16142 1727204131.74779: variable 'ansible_search_path' from source: unknown 16142 1727204131.74783: variable 'ansible_search_path' from source: unknown 16142 1727204131.74817: calling self._execute() 16142 1727204131.74898: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204131.74903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204131.74916: variable 'omit' from source: magic vars 16142 1727204131.75269: variable 'ansible_distribution_major_version' from source: facts 16142 1727204131.75289: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204131.75545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204131.78442: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204131.78501: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204131.78529: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204131.78562: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204131.78581: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204131.78640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204131.78659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204131.78684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204131.78710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204131.78721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204131.78798: variable 'ansible_distribution_major_version' from source: facts 16142 1727204131.78813: Evaluated conditional (ansible_distribution_major_version | int < 8): False 16142 1727204131.78816: when evaluation is False, skipping this task 16142 1727204131.78819: _execute() done 16142 1727204131.78822: dumping result to json 16142 1727204131.78826: done dumping result, returning 16142 1727204131.78836: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-fddd-f6c7-000000000081] 16142 
1727204131.78839: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000081 16142 1727204131.78930: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000081 16142 1727204131.78933: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 16142 1727204131.78990: no more pending results, returning what we have 16142 1727204131.78994: results queue empty 16142 1727204131.78995: checking for any_errors_fatal 16142 1727204131.79000: done checking for any_errors_fatal 16142 1727204131.79001: checking for max_fail_percentage 16142 1727204131.79003: done checking for max_fail_percentage 16142 1727204131.79004: checking to see if all hosts have failed and the running result is not ok 16142 1727204131.79005: done checking to see if all hosts have failed 16142 1727204131.79005: getting the remaining hosts for this loop 16142 1727204131.79007: done getting the remaining hosts for this loop 16142 1727204131.79010: getting the next task for host managed-node2 16142 1727204131.79018: done getting next task for host managed-node2 16142 1727204131.79022: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 16142 1727204131.79025: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204131.79045: getting variables 16142 1727204131.79047: in VariableManager get_vars() 16142 1727204131.79103: Calling all_inventory to load vars for managed-node2 16142 1727204131.79106: Calling groups_inventory to load vars for managed-node2 16142 1727204131.79108: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204131.79116: Calling all_plugins_play to load vars for managed-node2 16142 1727204131.79119: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204131.79121: Calling groups_plugins_play to load vars for managed-node2 16142 1727204131.80091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204131.81943: done with get_vars() 16142 1727204131.81973: done getting variables 16142 1727204131.82045: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:55:31 -0400 (0:00:00.079) 0:00:30.997 ***** 16142 1727204131.82080: entering _queue_task() for managed-node2/fail 16142 1727204131.82415: worker is 1 (out of 1 available) 16142 1727204131.82429: exiting _queue_task() for managed-node2/fail 16142 1727204131.82444: done queuing things up, now waiting for results queue to drain 16142 1727204131.82445: waiting for pending results... 
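The consent prompt at roles/network/tasks/main.yml:60 is another 'fail'-based guard; the skip result further below again reports false_condition "__network_wireless_connections_defined or __network_team_connections_defined", since this run defines neither wireless nor team connections. A sketch under the same caveats as the earlier guards:

    # Hypothetical sketch; the when expression is the one reported in the skip result
    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      fail:
        msg: Restarting NetworkManager is required for wireless or team interfaces; please confirm before re-running  # assumed wording
      when:
        - __network_wireless_connections_defined or __network_team_connections_defined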
16142 1727204131.82783: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 16142 1727204131.82925: in run() - task 0affcd87-79f5-fddd-f6c7-000000000082 16142 1727204131.82944: variable 'ansible_search_path' from source: unknown 16142 1727204131.82948: variable 'ansible_search_path' from source: unknown 16142 1727204131.82985: calling self._execute() 16142 1727204131.83089: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204131.83094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204131.83110: variable 'omit' from source: magic vars 16142 1727204131.83543: variable 'ansible_distribution_major_version' from source: facts 16142 1727204131.83571: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204131.83712: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204131.83866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204131.85622: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204131.86070: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204131.86122: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204131.86194: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204131.86245: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204131.86382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204131.86452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204131.86490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204131.86527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204131.86543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204131.86577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204131.86592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204131.86609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204131.86647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204131.86658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204131.86689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204131.86704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204131.86720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204131.86750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204131.86762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204131.86881: variable 'network_connections' from source: task vars 16142 1727204131.86891: variable 'controller_profile' from source: play vars 16142 1727204131.86944: variable 'controller_profile' from source: play vars 16142 1727204131.86998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204131.87113: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204131.87142: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204131.87166: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204131.87190: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204131.87235: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204131.87253: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204131.87271: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204131.87291: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204131.87331: variable '__network_team_connections_defined' from source: role '' defaults 16142 1727204131.87492: variable 
'network_connections' from source: task vars 16142 1727204131.87495: variable 'controller_profile' from source: play vars 16142 1727204131.87543: variable 'controller_profile' from source: play vars 16142 1727204131.87562: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 16142 1727204131.87567: when evaluation is False, skipping this task 16142 1727204131.87570: _execute() done 16142 1727204131.87573: dumping result to json 16142 1727204131.87575: done dumping result, returning 16142 1727204131.87582: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-fddd-f6c7-000000000082] 16142 1727204131.87588: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000082 16142 1727204131.87687: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000082 16142 1727204131.87690: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 16142 1727204131.87760: no more pending results, returning what we have 16142 1727204131.87766: results queue empty 16142 1727204131.87767: checking for any_errors_fatal 16142 1727204131.87773: done checking for any_errors_fatal 16142 1727204131.87774: checking for max_fail_percentage 16142 1727204131.87777: done checking for max_fail_percentage 16142 1727204131.87778: checking to see if all hosts have failed and the running result is not ok 16142 1727204131.87778: done checking to see if all hosts have failed 16142 1727204131.87779: getting the remaining hosts for this loop 16142 1727204131.87781: done getting the remaining hosts for this loop 16142 1727204131.87784: getting the next task for host managed-node2 16142 1727204131.87789: done getting next task for host managed-node2 16142 1727204131.87794: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 16142 1727204131.87796: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204131.87813: getting variables 16142 1727204131.87815: in VariableManager get_vars() 16142 1727204131.87865: Calling all_inventory to load vars for managed-node2 16142 1727204131.87868: Calling groups_inventory to load vars for managed-node2 16142 1727204131.87870: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204131.87879: Calling all_plugins_play to load vars for managed-node2 16142 1727204131.87881: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204131.87884: Calling groups_plugins_play to load vars for managed-node2 16142 1727204131.88851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204131.89766: done with get_vars() 16142 1727204131.89790: done getting variables 16142 1727204131.89838: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:55:31 -0400 (0:00:00.077) 0:00:31.075 ***** 16142 1727204131.89867: entering _queue_task() for managed-node2/package 16142 1727204131.90116: worker is 1 (out of 1 available) 16142 1727204131.90130: exiting _queue_task() for managed-node2/package 16142 1727204131.90143: done queuing things up, now waiting for results queue to drain 16142 1727204131.90144: waiting for pending results... 16142 1727204131.90340: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 16142 1727204131.90433: in run() - task 0affcd87-79f5-fddd-f6c7-000000000083 16142 1727204131.90446: variable 'ansible_search_path' from source: unknown 16142 1727204131.90457: variable 'ansible_search_path' from source: unknown 16142 1727204131.90490: calling self._execute() 16142 1727204131.90574: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204131.90578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204131.90585: variable 'omit' from source: magic vars 16142 1727204131.90886: variable 'ansible_distribution_major_version' from source: facts 16142 1727204131.90900: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204131.91051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204131.91253: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204131.91290: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204131.91315: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204131.91379: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204131.91462: variable 'network_packages' from source: role '' defaults 16142 1727204131.91542: variable '__network_provider_setup' from source: role '' defaults 16142 1727204131.91553: variable '__network_service_name_default_nm' from source: role '' defaults 16142 1727204131.91602: variable 
'__network_service_name_default_nm' from source: role '' defaults 16142 1727204131.91609: variable '__network_packages_default_nm' from source: role '' defaults 16142 1727204131.91658: variable '__network_packages_default_nm' from source: role '' defaults 16142 1727204131.91780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204131.93294: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204131.93342: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204131.93372: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204131.93400: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204131.93420: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204131.93495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204131.93518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204131.93538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204131.93567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204131.93578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204131.93612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204131.93629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204131.93649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204131.93675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204131.93686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204131.93845: variable '__network_packages_default_gobject_packages' from source: role '' defaults 16142 1727204131.93921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204131.93940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204131.93962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204131.93988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204131.93998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204131.94068: variable 'ansible_python' from source: facts 16142 1727204131.94094: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 16142 1727204131.94155: variable '__network_wpa_supplicant_required' from source: role '' defaults 16142 1727204131.94214: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 16142 1727204131.94304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204131.94321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204131.94340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204131.94368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204131.94381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204131.94413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204131.94432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204131.94451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204131.94480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204131.94491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204131.94589: variable 'network_connections' from source: task vars 16142 1727204131.94599: variable 'controller_profile' from source: play vars 16142 1727204131.94674: variable 'controller_profile' from source: play vars 16142 1727204131.94729: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204131.94751: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204131.94774: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204131.94795: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204131.94833: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204131.95023: variable 'network_connections' from source: task vars 16142 1727204131.95026: variable 'controller_profile' from source: play vars 16142 1727204131.95103: variable 'controller_profile' from source: play vars 16142 1727204131.95130: variable '__network_packages_default_wireless' from source: role '' defaults 16142 1727204131.95187: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204131.95392: variable 'network_connections' from source: task vars 16142 1727204131.95395: variable 'controller_profile' from source: play vars 16142 1727204131.95443: variable 'controller_profile' from source: play vars 16142 1727204131.95464: variable '__network_packages_default_team' from source: role '' defaults 16142 1727204131.95518: variable '__network_team_connections_defined' from source: role '' defaults 16142 1727204131.95721: variable 'network_connections' from source: task vars 16142 1727204131.95724: variable 'controller_profile' from source: play vars 16142 1727204131.95773: variable 'controller_profile' from source: play vars 16142 1727204131.95812: variable '__network_service_name_default_initscripts' from source: role '' defaults 16142 1727204131.95856: variable '__network_service_name_default_initscripts' from source: role '' defaults 16142 1727204131.95862: variable '__network_packages_default_initscripts' from source: role '' defaults 16142 1727204131.95909: variable '__network_packages_default_initscripts' from source: role '' defaults 16142 1727204131.96048: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 16142 1727204131.96356: variable 'network_connections' from source: task vars 16142 1727204131.96360: variable 'controller_profile' from source: play vars 16142 1727204131.96405: variable 'controller_profile' from source: play vars 16142 1727204131.96411: variable 'ansible_distribution' from source: facts 16142 1727204131.96415: variable '__network_rh_distros' from source: role '' defaults 16142 1727204131.96421: variable 'ansible_distribution_major_version' from source: facts 16142 1727204131.96433: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 16142 
1727204131.96541: variable 'ansible_distribution' from source: facts 16142 1727204131.96544: variable '__network_rh_distros' from source: role '' defaults 16142 1727204131.96551: variable 'ansible_distribution_major_version' from source: facts 16142 1727204131.96561: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 16142 1727204131.96669: variable 'ansible_distribution' from source: facts 16142 1727204131.96673: variable '__network_rh_distros' from source: role '' defaults 16142 1727204131.96677: variable 'ansible_distribution_major_version' from source: facts 16142 1727204131.96703: variable 'network_provider' from source: set_fact 16142 1727204131.96714: variable 'ansible_facts' from source: unknown 16142 1727204131.97180: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 16142 1727204131.97188: when evaluation is False, skipping this task 16142 1727204131.97191: _execute() done 16142 1727204131.97193: dumping result to json 16142 1727204131.97200: done dumping result, returning 16142 1727204131.97203: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-fddd-f6c7-000000000083] 16142 1727204131.97205: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000083 16142 1727204131.97305: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000083 16142 1727204131.97308: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 16142 1727204131.97376: no more pending results, returning what we have 16142 1727204131.97380: results queue empty 16142 1727204131.97381: checking for any_errors_fatal 16142 1727204131.97387: done checking for any_errors_fatal 16142 1727204131.97388: checking for max_fail_percentage 16142 1727204131.97390: done checking for max_fail_percentage 16142 1727204131.97391: checking to see if all hosts have failed and the running result is not ok 16142 1727204131.97392: done checking to see if all hosts have failed 16142 1727204131.97393: getting the remaining hosts for this loop 16142 1727204131.97394: done getting the remaining hosts for this loop 16142 1727204131.97398: getting the next task for host managed-node2 16142 1727204131.97408: done getting next task for host managed-node2 16142 1727204131.97416: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 16142 1727204131.97419: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204131.97436: getting variables 16142 1727204131.97438: in VariableManager get_vars() 16142 1727204131.97486: Calling all_inventory to load vars for managed-node2 16142 1727204131.97489: Calling groups_inventory to load vars for managed-node2 16142 1727204131.97491: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204131.97500: Calling all_plugins_play to load vars for managed-node2 16142 1727204131.97502: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204131.97505: Calling groups_plugins_play to load vars for managed-node2 16142 1727204131.98325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204131.99249: done with get_vars() 16142 1727204131.99267: done getting variables 16142 1727204131.99310: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:55:31 -0400 (0:00:00.094) 0:00:31.170 ***** 16142 1727204131.99335: entering _queue_task() for managed-node2/package 16142 1727204131.99562: worker is 1 (out of 1 available) 16142 1727204131.99578: exiting _queue_task() for managed-node2/package 16142 1727204131.99591: done queuing things up, now waiting for results queue to drain 16142 1727204131.99593: waiting for pending results... 
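The entries above show the role skipping its "Install packages" task (tasks/main.yml:73): the conditional not network_packages is subset(ansible_facts.packages.keys()) evaluated to False, meaning every package listed in network_packages already appears in the gathered package facts, so the package module is never invoked. A minimal sketch of a task guarded in the same way is shown below; it is illustrative only, not the role's actual source, and it assumes ansible_facts.packages has already been populated (normally by the package_facts module).

    # Illustrative sketch, not the role's source: only call the package
    # module when something in network_packages is missing from the
    # gathered package facts.
    - name: Install packages
      package:
        name: "{{ network_packages }}"
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())
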
16142 1727204131.99776: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 16142 1727204131.99885: in run() - task 0affcd87-79f5-fddd-f6c7-000000000084 16142 1727204131.99897: variable 'ansible_search_path' from source: unknown 16142 1727204131.99902: variable 'ansible_search_path' from source: unknown 16142 1727204131.99936: calling self._execute() 16142 1727204132.00017: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204132.00021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204132.00031: variable 'omit' from source: magic vars 16142 1727204132.00305: variable 'ansible_distribution_major_version' from source: facts 16142 1727204132.00315: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204132.00405: variable 'network_state' from source: role '' defaults 16142 1727204132.00413: Evaluated conditional (network_state != {}): False 16142 1727204132.00416: when evaluation is False, skipping this task 16142 1727204132.00419: _execute() done 16142 1727204132.00422: dumping result to json 16142 1727204132.00426: done dumping result, returning 16142 1727204132.00432: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-fddd-f6c7-000000000084] 16142 1727204132.00442: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000084 16142 1727204132.00533: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000084 16142 1727204132.00536: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204132.00591: no more pending results, returning what we have 16142 1727204132.00595: results queue empty 16142 1727204132.00596: checking for any_errors_fatal 16142 1727204132.00604: done checking for any_errors_fatal 16142 1727204132.00605: checking for max_fail_percentage 16142 1727204132.00607: done checking for max_fail_percentage 16142 1727204132.00608: checking to see if all hosts have failed and the running result is not ok 16142 1727204132.00609: done checking to see if all hosts have failed 16142 1727204132.00609: getting the remaining hosts for this loop 16142 1727204132.00611: done getting the remaining hosts for this loop 16142 1727204132.00615: getting the next task for host managed-node2 16142 1727204132.00621: done getting next task for host managed-node2 16142 1727204132.00625: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 16142 1727204132.00627: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204132.00643: getting variables 16142 1727204132.00645: in VariableManager get_vars() 16142 1727204132.00696: Calling all_inventory to load vars for managed-node2 16142 1727204132.00699: Calling groups_inventory to load vars for managed-node2 16142 1727204132.00701: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204132.00709: Calling all_plugins_play to load vars for managed-node2 16142 1727204132.00712: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204132.00714: Calling groups_plugins_play to load vars for managed-node2 16142 1727204132.01608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204132.02521: done with get_vars() 16142 1727204132.02538: done getting variables 16142 1727204132.02582: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:55:32 -0400 (0:00:00.032) 0:00:31.202 ***** 16142 1727204132.02606: entering _queue_task() for managed-node2/package 16142 1727204132.02824: worker is 1 (out of 1 available) 16142 1727204132.02838: exiting _queue_task() for managed-node2/package 16142 1727204132.02852: done queuing things up, now waiting for results queue to drain 16142 1727204132.02853: waiting for pending results... 
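The task at tasks/main.yml:85 was skipped because network_state != {} evaluated to False: with the role defaults in effect, network_state is an empty dict, so neither NetworkManager/nmstate nor (in the task queued next, main.yml:96) python3-libnmstate needs to be installed. A hypothetical invocation that would flip that conditional to True is sketched below; the interface values are purely illustrative and not taken from this run.

    # Hypothetical play: passing a non-empty network_state to the role
    # would satisfy the "network_state != {}" guard on the nmstate tasks.
    - hosts: managed-node2
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_state:
              interfaces:
                - name: eth1
                  type: ethernet
                  state: up
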
16142 1727204132.03036: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 16142 1727204132.03132: in run() - task 0affcd87-79f5-fddd-f6c7-000000000085 16142 1727204132.03145: variable 'ansible_search_path' from source: unknown 16142 1727204132.03148: variable 'ansible_search_path' from source: unknown 16142 1727204132.03180: calling self._execute() 16142 1727204132.03253: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204132.03257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204132.03268: variable 'omit' from source: magic vars 16142 1727204132.03559: variable 'ansible_distribution_major_version' from source: facts 16142 1727204132.03570: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204132.03659: variable 'network_state' from source: role '' defaults 16142 1727204132.03668: Evaluated conditional (network_state != {}): False 16142 1727204132.03672: when evaluation is False, skipping this task 16142 1727204132.03675: _execute() done 16142 1727204132.03677: dumping result to json 16142 1727204132.03682: done dumping result, returning 16142 1727204132.03688: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-fddd-f6c7-000000000085] 16142 1727204132.03695: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000085 16142 1727204132.03788: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000085 16142 1727204132.03791: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204132.03844: no more pending results, returning what we have 16142 1727204132.03848: results queue empty 16142 1727204132.03849: checking for any_errors_fatal 16142 1727204132.03854: done checking for any_errors_fatal 16142 1727204132.03855: checking for max_fail_percentage 16142 1727204132.03856: done checking for max_fail_percentage 16142 1727204132.03857: checking to see if all hosts have failed and the running result is not ok 16142 1727204132.03858: done checking to see if all hosts have failed 16142 1727204132.03858: getting the remaining hosts for this loop 16142 1727204132.03860: done getting the remaining hosts for this loop 16142 1727204132.03865: getting the next task for host managed-node2 16142 1727204132.03871: done getting next task for host managed-node2 16142 1727204132.03875: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 16142 1727204132.03878: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204132.03892: getting variables 16142 1727204132.03894: in VariableManager get_vars() 16142 1727204132.03945: Calling all_inventory to load vars for managed-node2 16142 1727204132.03948: Calling groups_inventory to load vars for managed-node2 16142 1727204132.03950: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204132.03957: Calling all_plugins_play to load vars for managed-node2 16142 1727204132.03958: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204132.03960: Calling groups_plugins_play to load vars for managed-node2 16142 1727204132.04729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204132.05768: done with get_vars() 16142 1727204132.05784: done getting variables 16142 1727204132.05826: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:55:32 -0400 (0:00:00.032) 0:00:31.235 ***** 16142 1727204132.05853: entering _queue_task() for managed-node2/service 16142 1727204132.06079: worker is 1 (out of 1 available) 16142 1727204132.06093: exiting _queue_task() for managed-node2/service 16142 1727204132.06106: done queuing things up, now waiting for results queue to drain 16142 1727204132.06108: waiting for pending results... 
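The service task queued here ("Restart NetworkManager due to wireless or team interfaces", tasks/main.yml:109) is guarded by __network_wireless_connections_defined or __network_team_connections_defined, flags the role derives from the network_connections list. In this run the only profile referenced is controller_profile, which defines neither a wireless nor a team interface, so the entries that follow evaluate the conditional to False and skip the restart. A hypothetical network_connections entry that would take the restart path might look like the sketch below (illustrative only; none of these values come from this run).

    # Hypothetical connection profile: a team (or wireless) profile in
    # network_connections is what makes the role consider restarting
    # NetworkManager.
    network_connections:
      - name: team0
        type: team
        state: up
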
16142 1727204132.06292: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 16142 1727204132.06386: in run() - task 0affcd87-79f5-fddd-f6c7-000000000086 16142 1727204132.06397: variable 'ansible_search_path' from source: unknown 16142 1727204132.06401: variable 'ansible_search_path' from source: unknown 16142 1727204132.06433: calling self._execute() 16142 1727204132.06503: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204132.06508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204132.06516: variable 'omit' from source: magic vars 16142 1727204132.06794: variable 'ansible_distribution_major_version' from source: facts 16142 1727204132.06804: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204132.06891: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204132.07023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204132.08628: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204132.08685: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204132.08717: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204132.08743: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204132.08763: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204132.08823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204132.08847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204132.08866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204132.08892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204132.08903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204132.08940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204132.08956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204132.08974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 16142 1727204132.08999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204132.09009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204132.09037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204132.09058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204132.09077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204132.09101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204132.09110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204132.09232: variable 'network_connections' from source: task vars 16142 1727204132.09242: variable 'controller_profile' from source: play vars 16142 1727204132.09300: variable 'controller_profile' from source: play vars 16142 1727204132.09352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204132.09471: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204132.09512: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204132.09539: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204132.09561: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204132.09597: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204132.09613: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204132.09630: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204132.09650: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204132.09691: variable '__network_team_connections_defined' from source: role '' defaults 16142 1727204132.09853: variable 'network_connections' from source: task vars 16142 1727204132.09856: variable 
'controller_profile' from source: play vars 16142 1727204132.09904: variable 'controller_profile' from source: play vars 16142 1727204132.09925: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 16142 1727204132.09929: when evaluation is False, skipping this task 16142 1727204132.09931: _execute() done 16142 1727204132.09934: dumping result to json 16142 1727204132.09938: done dumping result, returning 16142 1727204132.09947: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-fddd-f6c7-000000000086] 16142 1727204132.09953: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000086 16142 1727204132.10048: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000086 16142 1727204132.10057: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 16142 1727204132.10101: no more pending results, returning what we have 16142 1727204132.10104: results queue empty 16142 1727204132.10105: checking for any_errors_fatal 16142 1727204132.10111: done checking for any_errors_fatal 16142 1727204132.10112: checking for max_fail_percentage 16142 1727204132.10114: done checking for max_fail_percentage 16142 1727204132.10115: checking to see if all hosts have failed and the running result is not ok 16142 1727204132.10115: done checking to see if all hosts have failed 16142 1727204132.10116: getting the remaining hosts for this loop 16142 1727204132.10118: done getting the remaining hosts for this loop 16142 1727204132.10121: getting the next task for host managed-node2 16142 1727204132.10127: done getting next task for host managed-node2 16142 1727204132.10132: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 16142 1727204132.10136: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204132.10153: getting variables 16142 1727204132.10155: in VariableManager get_vars() 16142 1727204132.10210: Calling all_inventory to load vars for managed-node2 16142 1727204132.10213: Calling groups_inventory to load vars for managed-node2 16142 1727204132.10215: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204132.10224: Calling all_plugins_play to load vars for managed-node2 16142 1727204132.10227: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204132.10229: Calling groups_plugins_play to load vars for managed-node2 16142 1727204132.11076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204132.12071: done with get_vars() 16142 1727204132.12093: done getting variables 16142 1727204132.12143: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:55:32 -0400 (0:00:00.063) 0:00:31.298 ***** 16142 1727204132.12170: entering _queue_task() for managed-node2/service 16142 1727204132.12417: worker is 1 (out of 1 available) 16142 1727204132.12436: exiting _queue_task() for managed-node2/service 16142 1727204132.12450: done queuing things up, now waiting for results queue to drain 16142 1727204132.12451: waiting for pending results... 16142 1727204132.12640: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 16142 1727204132.12741: in run() - task 0affcd87-79f5-fddd-f6c7-000000000087 16142 1727204132.12751: variable 'ansible_search_path' from source: unknown 16142 1727204132.12754: variable 'ansible_search_path' from source: unknown 16142 1727204132.12790: calling self._execute() 16142 1727204132.12866: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204132.12870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204132.12883: variable 'omit' from source: magic vars 16142 1727204132.13159: variable 'ansible_distribution_major_version' from source: facts 16142 1727204132.13204: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204132.13380: variable 'network_provider' from source: set_fact 16142 1727204132.13392: variable 'network_state' from source: role '' defaults 16142 1727204132.13407: Evaluated conditional (network_provider == "nm" or network_state != {}): True 16142 1727204132.13416: variable 'omit' from source: magic vars 16142 1727204132.13488: variable 'omit' from source: magic vars 16142 1727204132.13523: variable 'network_service_name' from source: role '' defaults 16142 1727204132.13606: variable 'network_service_name' from source: role '' defaults 16142 1727204132.13738: variable '__network_provider_setup' from source: role '' defaults 16142 1727204132.13753: variable '__network_service_name_default_nm' from source: role '' defaults 16142 1727204132.13837: variable '__network_service_name_default_nm' from source: role '' defaults 16142 1727204132.13851: variable '__network_packages_default_nm' from source: role '' defaults 
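Unlike the skipped tasks above, "Enable and start NetworkManager" (tasks/main.yml:122) passes its guard: network_provider was set to "nm" earlier via set_fact, so network_provider == "nm" or network_state != {} evaluates to True, and the entries that follow go on to resolve the ssh connection plugin and actually run the service module on managed-node2. A rough sketch of such a task is below; it is illustrative, not the role's source, and it assumes network_service_name resolves to the NetworkManager unit under the nm provider, as the role defaults loaded above suggest.

    # Illustrative sketch of an "enable and start" service task gated on
    # the provider; the real role derives the unit name from its defaults.
    - name: Enable and start NetworkManager
      service:
        name: "{{ network_service_name }}"
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}
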
16142 1727204132.13923: variable '__network_packages_default_nm' from source: role '' defaults 16142 1727204132.14187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204132.16006: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204132.16065: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204132.16095: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204132.16121: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204132.16143: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204132.16206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204132.16226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204132.16247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204132.16278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204132.16289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204132.16321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204132.16338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204132.16355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204132.16523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204132.17188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204132.17433: variable '__network_packages_default_gobject_packages' from source: role '' defaults 16142 1727204132.17558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204132.17583: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204132.17607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204132.17649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204132.17663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204132.17760: variable 'ansible_python' from source: facts 16142 1727204132.17788: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 16142 1727204132.17875: variable '__network_wpa_supplicant_required' from source: role '' defaults 16142 1727204132.17954: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 16142 1727204132.18084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204132.18106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204132.18129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204132.18173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204132.18189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204132.18239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204132.18264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204132.18288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204132.18326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204132.18344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204132.18488: variable 'network_connections' from 
source: task vars 16142 1727204132.18496: variable 'controller_profile' from source: play vars 16142 1727204132.18578: variable 'controller_profile' from source: play vars 16142 1727204132.18689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204132.19256: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204132.19367: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204132.19371: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204132.19392: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204132.19460: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204132.19491: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204132.19524: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204132.19559: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204132.19607: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204132.19903: variable 'network_connections' from source: task vars 16142 1727204132.19910: variable 'controller_profile' from source: play vars 16142 1727204132.19988: variable 'controller_profile' from source: play vars 16142 1727204132.20023: variable '__network_packages_default_wireless' from source: role '' defaults 16142 1727204132.20106: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204132.20556: variable 'network_connections' from source: task vars 16142 1727204132.20560: variable 'controller_profile' from source: play vars 16142 1727204132.20633: variable 'controller_profile' from source: play vars 16142 1727204132.20660: variable '__network_packages_default_team' from source: role '' defaults 16142 1727204132.20742: variable '__network_team_connections_defined' from source: role '' defaults 16142 1727204132.21046: variable 'network_connections' from source: task vars 16142 1727204132.21049: variable 'controller_profile' from source: play vars 16142 1727204132.21123: variable 'controller_profile' from source: play vars 16142 1727204132.21181: variable '__network_service_name_default_initscripts' from source: role '' defaults 16142 1727204132.21242: variable '__network_service_name_default_initscripts' from source: role '' defaults 16142 1727204132.21249: variable '__network_packages_default_initscripts' from source: role '' defaults 16142 1727204132.21311: variable '__network_packages_default_initscripts' from source: role '' defaults 16142 1727204132.21629: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 16142 1727204132.22134: variable 'network_connections' from source: task vars 16142 1727204132.22142: variable 'controller_profile' from source: play 
vars 16142 1727204132.22380: variable 'controller_profile' from source: play vars 16142 1727204132.22388: variable 'ansible_distribution' from source: facts 16142 1727204132.22391: variable '__network_rh_distros' from source: role '' defaults 16142 1727204132.22397: variable 'ansible_distribution_major_version' from source: facts 16142 1727204132.22412: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 16142 1727204132.22594: variable 'ansible_distribution' from source: facts 16142 1727204132.22598: variable '__network_rh_distros' from source: role '' defaults 16142 1727204132.22603: variable 'ansible_distribution_major_version' from source: facts 16142 1727204132.22616: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 16142 1727204132.22790: variable 'ansible_distribution' from source: facts 16142 1727204132.22793: variable '__network_rh_distros' from source: role '' defaults 16142 1727204132.22800: variable 'ansible_distribution_major_version' from source: facts 16142 1727204132.22842: variable 'network_provider' from source: set_fact 16142 1727204132.22866: variable 'omit' from source: magic vars 16142 1727204132.22899: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204132.22928: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204132.22952: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204132.22970: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204132.22982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204132.23012: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204132.23015: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204132.23017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204132.23123: Set connection var ansible_timeout to 10 16142 1727204132.23126: Set connection var ansible_connection to ssh 16142 1727204132.23132: Set connection var ansible_shell_type to sh 16142 1727204132.23142: Set connection var ansible_shell_executable to /bin/sh 16142 1727204132.23148: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204132.23156: Set connection var ansible_pipelining to False 16142 1727204132.23185: variable 'ansible_shell_executable' from source: unknown 16142 1727204132.23188: variable 'ansible_connection' from source: unknown 16142 1727204132.23191: variable 'ansible_module_compression' from source: unknown 16142 1727204132.23195: variable 'ansible_shell_type' from source: unknown 16142 1727204132.23197: variable 'ansible_shell_executable' from source: unknown 16142 1727204132.23199: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204132.23202: variable 'ansible_pipelining' from source: unknown 16142 1727204132.23204: variable 'ansible_timeout' from source: unknown 16142 1727204132.23210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204132.23655: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204132.23667: variable 'omit' from source: magic vars 16142 1727204132.23675: starting attempt loop 16142 1727204132.23677: running the handler 16142 1727204132.23768: variable 'ansible_facts' from source: unknown 16142 1727204132.24626: _low_level_execute_command(): starting 16142 1727204132.24641: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204132.26560: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204132.26610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204132.26623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204132.26641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204132.26684: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204132.26825: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204132.26836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204132.26853: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204132.26861: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204132.26870: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204132.26878: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204132.26888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204132.26900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204132.26908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204132.26914: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204132.26925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204132.27001: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204132.27020: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204132.27032: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204132.27111: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204132.28794: stdout chunk (state=3): >>>/root <<< 16142 1727204132.28980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204132.28984: stdout chunk (state=3): >>><<< 16142 1727204132.28994: stderr chunk (state=3): >>><<< 16142 1727204132.29015: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204132.29027: _low_level_execute_command(): starting 16142 1727204132.29034: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204132.2901525-18572-216397920484913 `" && echo ansible-tmp-1727204132.2901525-18572-216397920484913="` echo /root/.ansible/tmp/ansible-tmp-1727204132.2901525-18572-216397920484913 `" ) && sleep 0' 16142 1727204132.30520: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204132.30532: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204132.30543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204132.30557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204132.30604: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204132.30607: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204132.30618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204132.30632: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204132.30641: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204132.30648: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204132.30655: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204132.30666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204132.30681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204132.30688: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204132.30694: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204132.30704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204132.30779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204132.30801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204132.30814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204132.30891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204132.32809: stdout chunk (state=3): >>>ansible-tmp-1727204132.2901525-18572-216397920484913=/root/.ansible/tmp/ansible-tmp-1727204132.2901525-18572-216397920484913 <<< 16142 1727204132.32924: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 16142 1727204132.33022: stderr chunk (state=3): >>><<< 16142 1727204132.33028: stdout chunk (state=3): >>><<< 16142 1727204132.33059: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204132.2901525-18572-216397920484913=/root/.ansible/tmp/ansible-tmp-1727204132.2901525-18572-216397920484913 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204132.33096: variable 'ansible_module_compression' from source: unknown 16142 1727204132.33158: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 16142 1727204132.33228: variable 'ansible_facts' from source: unknown 16142 1727204132.33446: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204132.2901525-18572-216397920484913/AnsiballZ_systemd.py 16142 1727204132.34061: Sending initial data 16142 1727204132.34066: Sent initial data (156 bytes) 16142 1727204132.36584: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204132.36591: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204132.36601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204132.36616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204132.36668: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204132.36674: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204132.36752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204132.36767: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204132.36776: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204132.36783: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204132.36791: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204132.36801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204132.36813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204132.36821: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204132.36828: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204132.36839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204132.36915: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204132.36979: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204132.36992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204132.37158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204132.38866: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204132.38897: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204132.38943: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpc3fqg2aw /root/.ansible/tmp/ansible-tmp-1727204132.2901525-18572-216397920484913/AnsiballZ_systemd.py <<< 16142 1727204132.38979: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204132.42278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204132.42405: stderr chunk (state=3): >>><<< 16142 1727204132.42409: stdout chunk (state=3): >>><<< 16142 1727204132.42411: done transferring module to remote 16142 1727204132.42415: _low_level_execute_command(): starting 16142 1727204132.42418: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204132.2901525-18572-216397920484913/ /root/.ansible/tmp/ansible-tmp-1727204132.2901525-18572-216397920484913/AnsiballZ_systemd.py && sleep 0' 16142 1727204132.45465: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204132.45471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204132.45489: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204132.45497: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204132.45508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204132.45526: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204132.45535: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204132.45544: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204132.45552: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204132.45561: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204132.45575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204132.45582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204132.45588: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204132.45597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204132.45679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204132.45698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204132.45710: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204132.45785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204132.47671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204132.47675: stdout chunk (state=3): >>><<< 16142 1727204132.47681: stderr chunk (state=3): >>><<< 16142 1727204132.47701: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204132.47704: _low_level_execute_command(): starting 16142 1727204132.47709: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204132.2901525-18572-216397920484913/AnsiballZ_systemd.py && sleep 0' 16142 1727204132.48973: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204132.48993: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204132.49009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204132.49026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204132.49073: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204132.49087: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204132.49105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204132.49125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204132.49137: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204132.49148: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204132.49160: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204132.49177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204132.49193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204132.49210: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204132.49222: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204132.49235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204132.49313: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204132.49343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204132.49346: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204132.49548: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204132.75141: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "6799360", "MemoryAvailable": "infinity", "CPUUsageNSec": "884854000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", 
"IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft":<<< 16142 1727204132.75192: stdout chunk (state=3): >>> "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": 
"0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 16142 1727204132.76735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204132.76741: stdout chunk (state=3): >>><<< 16142 1727204132.76743: stderr chunk (state=3): >>><<< 16142 1727204132.76770: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "6799360", "MemoryAvailable": "infinity", "CPUUsageNSec": "884854000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": 
"infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", 
"ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
16142 1727204132.76976: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204132.2901525-18572-216397920484913/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204132.76993: _low_level_execute_command(): starting 16142 1727204132.77003: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204132.2901525-18572-216397920484913/ > /dev/null 2>&1 && sleep 0' 16142 1727204132.78374: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204132.78497: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204132.78522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204132.78548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204132.78593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204132.78608: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204132.78629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204132.78651: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204132.78666: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204132.78683: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204132.78696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204132.78710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204132.78747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204132.78766: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204132.78782: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204132.78801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204132.78889: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204132.78914: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204132.78932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204132.79079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204132.80809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204132.80905: stderr chunk (state=3): >>><<< 16142 1727204132.80909: stdout chunk (state=3): >>><<< 16142 1727204132.80932: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204132.80938: handler run complete 16142 1727204132.81005: attempt loop complete, returning result 16142 1727204132.81008: _execute() done 16142 1727204132.81011: dumping result to json 16142 1727204132.81027: done dumping result, returning 16142 1727204132.81039: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-fddd-f6c7-000000000087] 16142 1727204132.81041: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000087 16142 1727204132.81345: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000087 16142 1727204132.81349: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204132.81413: no more pending results, returning what we have 16142 1727204132.81417: results queue empty 16142 1727204132.81418: checking for any_errors_fatal 16142 1727204132.81424: done checking for any_errors_fatal 16142 1727204132.81425: checking for max_fail_percentage 16142 1727204132.81427: done checking for max_fail_percentage 16142 1727204132.81428: checking to see if all hosts have failed and the running result is not ok 16142 1727204132.81428: done checking to see if all hosts have failed 16142 1727204132.81429: getting the remaining hosts for this loop 16142 1727204132.81431: done getting the remaining hosts for this loop 16142 1727204132.81435: getting the next task for host managed-node2 16142 1727204132.81442: done getting next task for host managed-node2 16142 1727204132.81446: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 16142 1727204132.81449: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204132.81461: getting variables 16142 1727204132.81465: in VariableManager get_vars() 16142 1727204132.81519: Calling all_inventory to load vars for managed-node2 16142 1727204132.81522: Calling groups_inventory to load vars for managed-node2 16142 1727204132.81524: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204132.81536: Calling all_plugins_play to load vars for managed-node2 16142 1727204132.81539: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204132.81542: Calling groups_plugins_play to load vars for managed-node2 16142 1727204132.83409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204132.85106: done with get_vars() 16142 1727204132.85138: done getting variables 16142 1727204132.85210: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:55:32 -0400 (0:00:00.730) 0:00:32.029 ***** 16142 1727204132.85251: entering _queue_task() for managed-node2/service 16142 1727204132.85617: worker is 1 (out of 1 available) 16142 1727204132.85634: exiting _queue_task() for managed-node2/service 16142 1727204132.85648: done queuing things up, now waiting for results queue to drain 16142 1727204132.85649: waiting for pending results... 16142 1727204132.85965: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 16142 1727204132.86103: in run() - task 0affcd87-79f5-fddd-f6c7-000000000088 16142 1727204132.86115: variable 'ansible_search_path' from source: unknown 16142 1727204132.86119: variable 'ansible_search_path' from source: unknown 16142 1727204132.86157: calling self._execute() 16142 1727204132.86261: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204132.86267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204132.86280: variable 'omit' from source: magic vars 16142 1727204132.87069: variable 'ansible_distribution_major_version' from source: facts 16142 1727204132.87073: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204132.87075: variable 'network_provider' from source: set_fact 16142 1727204132.87078: Evaluated conditional (network_provider == "nm"): True 16142 1727204132.87080: variable '__network_wpa_supplicant_required' from source: role '' defaults 16142 1727204132.87082: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 16142 1727204132.87195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204132.89642: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204132.89715: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204132.89759: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204132.89800: Loading FilterModule 
'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204132.89826: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204132.89925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204132.89953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204132.89986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204132.90028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204132.90042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204132.90096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204132.90119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204132.90142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204132.90182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204132.90204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204132.90242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204132.90266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204132.90290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204132.90337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204132.90349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 16142 1727204132.90507: variable 'network_connections' from source: task vars 16142 1727204132.90527: variable 'controller_profile' from source: play vars 16142 1727204132.90599: variable 'controller_profile' from source: play vars 16142 1727204132.90680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204132.90854: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204132.90890: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204132.90916: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204132.90940: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204132.90988: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204132.91006: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204132.91027: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204132.91051: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204132.91113: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204132.91379: variable 'network_connections' from source: task vars 16142 1727204132.91390: variable 'controller_profile' from source: play vars 16142 1727204132.91458: variable 'controller_profile' from source: play vars 16142 1727204132.91494: Evaluated conditional (__network_wpa_supplicant_required): False 16142 1727204132.91497: when evaluation is False, skipping this task 16142 1727204132.91499: _execute() done 16142 1727204132.91509: dumping result to json 16142 1727204132.91512: done dumping result, returning 16142 1727204132.91521: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-fddd-f6c7-000000000088] 16142 1727204132.91532: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000088 16142 1727204132.91625: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000088 16142 1727204132.91628: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 16142 1727204132.91681: no more pending results, returning what we have 16142 1727204132.91686: results queue empty 16142 1727204132.91687: checking for any_errors_fatal 16142 1727204132.91712: done checking for any_errors_fatal 16142 1727204132.91713: checking for max_fail_percentage 16142 1727204132.91716: done checking for max_fail_percentage 16142 1727204132.91717: checking to see if all hosts have failed and the running result is not ok 16142 1727204132.91718: done checking to see if all hosts have failed 16142 1727204132.91719: getting the remaining hosts for this loop 
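
The skip recorded just above ('Enable and start wpa_supplicant', false_condition __network_wpa_supplicant_required) implies a service task gated on that role default and evaluated only when the provider is NetworkManager. A rough illustration of such a guard, using only the task name, the service action, and the conditionals visible in the log (not the role's actual source):

    - name: Enable and start wpa_supplicant      # illustrative sketch, not the role's real task
      ansible.builtin.service:
        name: wpa_supplicant                     # assumed service name, taken from the task title
        state: started
        enabled: true
      when:
        - network_provider == "nm"               # evaluated True in this run
        - __network_wpa_supplicant_required      # evaluated False here, hence the skip
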
16142 1727204132.91720: done getting the remaining hosts for this loop 16142 1727204132.91725: getting the next task for host managed-node2 16142 1727204132.91732: done getting next task for host managed-node2 16142 1727204132.91737: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 16142 1727204132.91740: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204132.91758: getting variables 16142 1727204132.91761: in VariableManager get_vars() 16142 1727204132.91824: Calling all_inventory to load vars for managed-node2 16142 1727204132.91827: Calling groups_inventory to load vars for managed-node2 16142 1727204132.91830: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204132.91840: Calling all_plugins_play to load vars for managed-node2 16142 1727204132.91843: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204132.91847: Calling groups_plugins_play to load vars for managed-node2 16142 1727204132.93574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204132.95306: done with get_vars() 16142 1727204132.95342: done getting variables 16142 1727204132.95406: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:55:32 -0400 (0:00:00.101) 0:00:32.131 ***** 16142 1727204132.95446: entering _queue_task() for managed-node2/service 16142 1727204132.95796: worker is 1 (out of 1 available) 16142 1727204132.95810: exiting _queue_task() for managed-node2/service 16142 1727204132.95822: done queuing things up, now waiting for results queue to drain 16142 1727204132.95824: waiting for pending results... 
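
The next two tasks queued here ('Enable network service' at tasks/main.yml:142 and 'Ensure initscripts network file dependency is present' at tasks/main.yml:150, a copy action) are both guarded on the initscripts provider; since network_provider was set to "nm" earlier in the run, both are skipped below. A minimal sketch of that kind of provider guard, with the task body assumed for illustration only:

    - name: Enable network service               # guard illustration; the real task body is not shown in this log
      ansible.builtin.service:
        name: network                            # assumed service name for illustration
        enabled: true
      when: network_provider == "initscripts"    # evaluated False in this run, so the task is skipped
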
16142 1727204132.96124: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 16142 1727204132.96251: in run() - task 0affcd87-79f5-fddd-f6c7-000000000089 16142 1727204132.96269: variable 'ansible_search_path' from source: unknown 16142 1727204132.96272: variable 'ansible_search_path' from source: unknown 16142 1727204132.96315: calling self._execute() 16142 1727204132.96414: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204132.96421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204132.96431: variable 'omit' from source: magic vars 16142 1727204132.96838: variable 'ansible_distribution_major_version' from source: facts 16142 1727204132.96849: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204132.96974: variable 'network_provider' from source: set_fact 16142 1727204132.96980: Evaluated conditional (network_provider == "initscripts"): False 16142 1727204132.96983: when evaluation is False, skipping this task 16142 1727204132.96986: _execute() done 16142 1727204132.96989: dumping result to json 16142 1727204132.96994: done dumping result, returning 16142 1727204132.97001: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-fddd-f6c7-000000000089] 16142 1727204132.97009: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000089 16142 1727204132.97106: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000089 16142 1727204132.97109: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204132.97178: no more pending results, returning what we have 16142 1727204132.97183: results queue empty 16142 1727204132.97184: checking for any_errors_fatal 16142 1727204132.97193: done checking for any_errors_fatal 16142 1727204132.97194: checking for max_fail_percentage 16142 1727204132.97196: done checking for max_fail_percentage 16142 1727204132.97197: checking to see if all hosts have failed and the running result is not ok 16142 1727204132.97198: done checking to see if all hosts have failed 16142 1727204132.97198: getting the remaining hosts for this loop 16142 1727204132.97200: done getting the remaining hosts for this loop 16142 1727204132.97204: getting the next task for host managed-node2 16142 1727204132.97213: done getting next task for host managed-node2 16142 1727204132.97217: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 16142 1727204132.97220: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204132.97240: getting variables 16142 1727204132.97242: in VariableManager get_vars() 16142 1727204132.97304: Calling all_inventory to load vars for managed-node2 16142 1727204132.97307: Calling groups_inventory to load vars for managed-node2 16142 1727204132.97310: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204132.97322: Calling all_plugins_play to load vars for managed-node2 16142 1727204132.97325: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204132.97328: Calling groups_plugins_play to load vars for managed-node2 16142 1727204132.99075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204133.00810: done with get_vars() 16142 1727204133.00841: done getting variables 16142 1727204133.00904: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:55:33 -0400 (0:00:00.054) 0:00:32.186 ***** 16142 1727204133.00944: entering _queue_task() for managed-node2/copy 16142 1727204133.01271: worker is 1 (out of 1 available) 16142 1727204133.01285: exiting _queue_task() for managed-node2/copy 16142 1727204133.01297: done queuing things up, now waiting for results queue to drain 16142 1727204133.01298: waiting for pending results... 16142 1727204133.01606: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 16142 1727204133.01744: in run() - task 0affcd87-79f5-fddd-f6c7-00000000008a 16142 1727204133.01765: variable 'ansible_search_path' from source: unknown 16142 1727204133.01773: variable 'ansible_search_path' from source: unknown 16142 1727204133.01819: calling self._execute() 16142 1727204133.01932: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204133.01946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204133.01971: variable 'omit' from source: magic vars 16142 1727204133.02387: variable 'ansible_distribution_major_version' from source: facts 16142 1727204133.02409: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204133.02539: variable 'network_provider' from source: set_fact 16142 1727204133.02551: Evaluated conditional (network_provider == "initscripts"): False 16142 1727204133.02559: when evaluation is False, skipping this task 16142 1727204133.02571: _execute() done 16142 1727204133.02581: dumping result to json 16142 1727204133.02588: done dumping result, returning 16142 1727204133.02599: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-fddd-f6c7-00000000008a] 16142 1727204133.02617: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000008a skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 16142 1727204133.02787: no more pending results, returning what we have 16142 
1727204133.02792: results queue empty 16142 1727204133.02793: checking for any_errors_fatal 16142 1727204133.02801: done checking for any_errors_fatal 16142 1727204133.02802: checking for max_fail_percentage 16142 1727204133.02804: done checking for max_fail_percentage 16142 1727204133.02805: checking to see if all hosts have failed and the running result is not ok 16142 1727204133.02806: done checking to see if all hosts have failed 16142 1727204133.02807: getting the remaining hosts for this loop 16142 1727204133.02809: done getting the remaining hosts for this loop 16142 1727204133.02814: getting the next task for host managed-node2 16142 1727204133.02822: done getting next task for host managed-node2 16142 1727204133.02826: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 16142 1727204133.02830: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204133.02852: getting variables 16142 1727204133.02855: in VariableManager get_vars() 16142 1727204133.02917: Calling all_inventory to load vars for managed-node2 16142 1727204133.02920: Calling groups_inventory to load vars for managed-node2 16142 1727204133.02922: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204133.02935: Calling all_plugins_play to load vars for managed-node2 16142 1727204133.02938: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204133.02942: Calling groups_plugins_play to load vars for managed-node2 16142 1727204133.03914: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000008a 16142 1727204133.03918: WORKER PROCESS EXITING 16142 1727204133.04744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204133.06571: done with get_vars() 16142 1727204133.06595: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:55:33 -0400 (0:00:00.057) 0:00:32.243 ***** 16142 1727204133.06690: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 16142 1727204133.07041: worker is 1 (out of 1 available) 16142 1727204133.07055: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 16142 1727204133.07071: done queuing things up, now waiting for results queue to drain 16142 1727204133.07073: waiting for pending results... 
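The two skips just above ('Enable network service' and 'Ensure initscripts network file dependency is present') share the same guard, `network_provider == "initscripts"`; since this run uses the NetworkManager provider (`provider: nm` in the module arguments further down), both evaluate to False. A minimal sketch of how such a `when:` expression reduces to a boolean once the task vars are known, rendered with Jinja2 directly (the helper name is made up, and Ansible's own templar does considerably more):

```python
from jinja2 import Environment

def evaluate_when(expression, variables):
    """Simplified sketch of evaluating an Ansible-style 'when' expression."""
    rendered = Environment().from_string("{{ (" + expression + ") }}").render(**variables)
    return rendered == "True"

# Value observed in this run: the provider resolves to NetworkManager ("nm").
print(evaluate_when('network_provider == "initscripts"', {"network_provider": "nm"}))  # -> False
```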
16142 1727204133.07379: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 16142 1727204133.07536: in run() - task 0affcd87-79f5-fddd-f6c7-00000000008b 16142 1727204133.07557: variable 'ansible_search_path' from source: unknown 16142 1727204133.07567: variable 'ansible_search_path' from source: unknown 16142 1727204133.07609: calling self._execute() 16142 1727204133.07715: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204133.07728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204133.07751: variable 'omit' from source: magic vars 16142 1727204133.08141: variable 'ansible_distribution_major_version' from source: facts 16142 1727204133.08160: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204133.08178: variable 'omit' from source: magic vars 16142 1727204133.08242: variable 'omit' from source: magic vars 16142 1727204133.08426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204133.10877: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204133.10957: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204133.11009: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204133.11054: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204133.11090: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204133.11177: variable 'network_provider' from source: set_fact 16142 1727204133.11323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204133.11380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204133.11412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204133.11468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204133.11489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204133.11573: variable 'omit' from source: magic vars 16142 1727204133.11699: variable 'omit' from source: magic vars 16142 1727204133.11811: variable 'network_connections' from source: task vars 16142 1727204133.11828: variable 'controller_profile' from source: play vars 16142 1727204133.11899: variable 'controller_profile' from source: play vars 16142 1727204133.12051: variable 'omit' from source: magic vars 16142 1727204133.12068: variable '__lsr_ansible_managed' from source: task vars 16142 1727204133.12140: variable '__lsr_ansible_managed' 
from source: task vars 16142 1727204133.12342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 16142 1727204133.12586: Loaded config def from plugin (lookup/template) 16142 1727204133.12595: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 16142 1727204133.12629: File lookup term: get_ansible_managed.j2 16142 1727204133.12641: variable 'ansible_search_path' from source: unknown 16142 1727204133.12654: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 16142 1727204133.12672: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 16142 1727204133.12695: variable 'ansible_search_path' from source: unknown 16142 1727204133.25746: variable 'ansible_managed' from source: unknown 16142 1727204133.25905: variable 'omit' from source: magic vars 16142 1727204133.25933: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204133.25960: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204133.25982: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204133.26008: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204133.26021: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204133.26044: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204133.26052: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204133.26059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204133.26161: Set connection var ansible_timeout to 10 16142 1727204133.26171: Set connection var ansible_connection to ssh 16142 1727204133.26180: Set connection var ansible_shell_type to sh 16142 1727204133.26189: Set connection var ansible_shell_executable to /bin/sh 16142 1727204133.26197: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204133.26207: Set connection var ansible_pipelining to False 16142 1727204133.26239: variable 'ansible_shell_executable' from source: unknown 16142 1727204133.26246: variable 'ansible_connection' from source: unknown 16142 1727204133.26252: variable 
'ansible_module_compression' from source: unknown 16142 1727204133.26258: variable 'ansible_shell_type' from source: unknown 16142 1727204133.26265: variable 'ansible_shell_executable' from source: unknown 16142 1727204133.26272: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204133.26279: variable 'ansible_pipelining' from source: unknown 16142 1727204133.26285: variable 'ansible_timeout' from source: unknown 16142 1727204133.26292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204133.26423: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204133.26452: variable 'omit' from source: magic vars 16142 1727204133.26463: starting attempt loop 16142 1727204133.26473: running the handler 16142 1727204133.26488: _low_level_execute_command(): starting 16142 1727204133.26495: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204133.27231: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204133.27249: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204133.27266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204133.27285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204133.27328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204133.27345: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204133.27360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204133.27379: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204133.27393: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204133.27404: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204133.27415: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204133.27427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204133.27443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204133.27460: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204133.27474: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204133.27490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204133.27571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204133.27604: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204133.27625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204133.27771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204133.29389: stdout chunk (state=3): >>>/root <<< 16142 1727204133.29490: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204133.29602: stderr chunk (state=3): >>><<< 16142 
1727204133.29605: stdout chunk (state=3): >>><<< 16142 1727204133.29671: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204133.29675: _low_level_execute_command(): starting 16142 1727204133.29678: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204133.2962883-18610-278635670080525 `" && echo ansible-tmp-1727204133.2962883-18610-278635670080525="` echo /root/.ansible/tmp/ansible-tmp-1727204133.2962883-18610-278635670080525 `" ) && sleep 0' 16142 1727204133.30576: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204133.30592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204133.30609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204133.30632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204133.30681: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204133.30742: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204133.30758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204133.30779: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204133.30791: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204133.30803: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204133.30817: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204133.30833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204133.30856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204133.30919: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204133.30937: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204133.30959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204133.31040: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 16142 1727204133.31144: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204133.31162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204133.31245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204133.33174: stdout chunk (state=3): >>>ansible-tmp-1727204133.2962883-18610-278635670080525=/root/.ansible/tmp/ansible-tmp-1727204133.2962883-18610-278635670080525 <<< 16142 1727204133.33303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204133.33438: stderr chunk (state=3): >>><<< 16142 1727204133.33457: stdout chunk (state=3): >>><<< 16142 1727204133.33773: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204133.2962883-18610-278635670080525=/root/.ansible/tmp/ansible-tmp-1727204133.2962883-18610-278635670080525 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204133.33781: variable 'ansible_module_compression' from source: unknown 16142 1727204133.33784: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 16142 1727204133.33786: variable 'ansible_facts' from source: unknown 16142 1727204133.33788: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204133.2962883-18610-278635670080525/AnsiballZ_network_connections.py 16142 1727204133.33987: Sending initial data 16142 1727204133.33991: Sent initial data (168 bytes) 16142 1727204133.35276: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204133.35304: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204133.35325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204133.35353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204133.35440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204133.35522: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204133.35548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204133.35570: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 16142 1727204133.35583: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204133.35595: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204133.35607: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204133.35622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204133.35647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204133.35661: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204133.35677: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204133.35691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204133.35778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204133.35800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204133.35815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204133.35893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204133.37712: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204133.37790: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204133.38138: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpybk8c5_m /root/.ansible/tmp/ansible-tmp-1727204133.2962883-18610-278635670080525/AnsiballZ_network_connections.py <<< 16142 1727204133.38143: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204133.39498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204133.39677: stderr chunk (state=3): >>><<< 16142 1727204133.39681: stdout chunk (state=3): >>><<< 16142 1727204133.39683: done transferring module to remote 16142 1727204133.39685: _low_level_execute_command(): starting 16142 1727204133.39689: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204133.2962883-18610-278635670080525/ /root/.ansible/tmp/ansible-tmp-1727204133.2962883-18610-278635670080525/AnsiballZ_network_connections.py && sleep 0' 16142 1727204133.41331: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204133.41459: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204133.41482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204133.41505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 
1727204133.41675: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204133.41687: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204133.41700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204133.41716: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204133.41729: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204133.41743: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204133.41755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204133.41773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204133.41789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204133.41800: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204133.41810: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204133.41822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204133.41905: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204133.41990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204133.42007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204133.42080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204133.43993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204133.43997: stdout chunk (state=3): >>><<< 16142 1727204133.43999: stderr chunk (state=3): >>><<< 16142 1727204133.44113: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204133.44122: _low_level_execute_command(): starting 16142 1727204133.44125: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204133.2962883-18610-278635670080525/AnsiballZ_network_connections.py && sleep 0' 16142 1727204133.45913: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 
1727204133.46017: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204133.46038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204133.46058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204133.46138: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204133.46152: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204133.46170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204133.46189: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204133.46217: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204133.46269: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204133.46289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204133.46324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204133.46344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204133.46357: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204133.46373: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204133.46398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204133.46530: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204133.46590: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204133.46641: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204133.46732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204133.86449: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_25xpojgi/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_25xpojgi/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/73afa86b-f147-47bf-9096-10366249563c: error=unknown <<< 16142 1727204133.86774: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": 
false, "__debug_flags": ""}}} <<< 16142 1727204133.88888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 16142 1727204133.88922: stderr chunk (state=3): >>><<< 16142 1727204133.88925: stdout chunk (state=3): >>><<< 16142 1727204133.89057: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_25xpojgi/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_25xpojgi/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/73afa86b-f147-47bf-9096-10366249563c: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
16142 1727204133.89061: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204133.2962883-18610-278635670080525/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204133.89065: _low_level_execute_command(): starting 16142 1727204133.89069: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204133.2962883-18610-278635670080525/ > /dev/null 2>&1 && sleep 0' 16142 1727204133.89611: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204133.89633: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204133.89648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204133.89671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204133.89716: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204133.89729: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204133.89743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204133.89761: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204133.89775: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204133.89786: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204133.89798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204133.89811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204133.89827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204133.89840: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204133.89851: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204133.89863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204133.89937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204133.89953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204133.89969: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204133.90052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204133.91921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204133.91925: stdout chunk 
(state=3): >>><<< 16142 1727204133.91931: stderr chunk (state=3): >>><<< 16142 1727204133.91962: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204133.91967: handler run complete 16142 1727204133.91998: attempt loop complete, returning result 16142 1727204133.92002: _execute() done 16142 1727204133.92004: dumping result to json 16142 1727204133.92006: done dumping result, returning 16142 1727204133.92017: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-fddd-f6c7-00000000008b] 16142 1727204133.92019: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000008b 16142 1727204133.92133: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000008b 16142 1727204133.92137: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 16142 1727204133.92227: no more pending results, returning what we have 16142 1727204133.92231: results queue empty 16142 1727204133.92232: checking for any_errors_fatal 16142 1727204133.92236: done checking for any_errors_fatal 16142 1727204133.92237: checking for max_fail_percentage 16142 1727204133.92238: done checking for max_fail_percentage 16142 1727204133.92239: checking to see if all hosts have failed and the running result is not ok 16142 1727204133.92240: done checking to see if all hosts have failed 16142 1727204133.92241: getting the remaining hosts for this loop 16142 1727204133.92242: done getting the remaining hosts for this loop 16142 1727204133.92245: getting the next task for host managed-node2 16142 1727204133.92250: done getting next task for host managed-node2 16142 1727204133.92254: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 16142 1727204133.92257: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204133.92272: getting variables 16142 1727204133.92274: in VariableManager get_vars() 16142 1727204133.92324: Calling all_inventory to load vars for managed-node2 16142 1727204133.92327: Calling groups_inventory to load vars for managed-node2 16142 1727204133.92329: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204133.92338: Calling all_plugins_play to load vars for managed-node2 16142 1727204133.92341: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204133.92343: Calling groups_plugins_play to load vars for managed-node2 16142 1727204133.93546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204133.94460: done with get_vars() 16142 1727204133.94483: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:55:33 -0400 (0:00:00.878) 0:00:33.122 ***** 16142 1727204133.94549: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 16142 1727204133.94802: worker is 1 (out of 1 available) 16142 1727204133.94820: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 16142 1727204133.94833: done queuing things up, now waiting for results queue to drain 16142 1727204133.94834: waiting for pending results... 
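Stepping back, the 'Configure networking connection profiles' task above shows the full remote-execution lifecycle: create a temp directory under ~/.ansible/tmp, SFTP the AnsiballZ_network_connections.py payload into it, mark it executable, run it with the remote /usr/bin/python3.9, and remove the directory afterwards. A condensed sketch of that same command sequence (the host alias and temp-directory name are placeholders; the real run multiplexes over an existing SSH ControlMaster and quotes the remote paths more defensively):

```python
import subprocess

host = "managed-node2"                         # placeholder for the real SSH target
tmp = "/root/.ansible/tmp/ansible-tmp-example"
payload = "AnsiballZ_network_connections.py"

subprocess.run(["ssh", host, f"umask 77 && mkdir -p {tmp}"], check=True)
subprocess.run(["sftp", "-b", "-", host],      # batch commands read from stdin
               input=f"put {payload} {tmp}/{payload}\n", text=True, check=True)
subprocess.run(["ssh", host, f"chmod u+x {tmp}/ {tmp}/{payload}"], check=True)
subprocess.run(["ssh", host, f"/usr/bin/python3.9 {tmp}/{payload}"], check=True)
subprocess.run(["ssh", host, f"rm -rf {tmp}/ > /dev/null 2>&1"], check=True)
```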
16142 1727204133.95162: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 16142 1727204133.95311: in run() - task 0affcd87-79f5-fddd-f6c7-00000000008c 16142 1727204133.95338: variable 'ansible_search_path' from source: unknown 16142 1727204133.95350: variable 'ansible_search_path' from source: unknown 16142 1727204133.95395: calling self._execute() 16142 1727204133.95501: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204133.95512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204133.95526: variable 'omit' from source: magic vars 16142 1727204133.95943: variable 'ansible_distribution_major_version' from source: facts 16142 1727204133.95961: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204133.96095: variable 'network_state' from source: role '' defaults 16142 1727204133.96114: Evaluated conditional (network_state != {}): False 16142 1727204133.96121: when evaluation is False, skipping this task 16142 1727204133.96127: _execute() done 16142 1727204133.96138: dumping result to json 16142 1727204133.96146: done dumping result, returning 16142 1727204133.96155: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-fddd-f6c7-00000000008c] 16142 1727204133.96173: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000008c 16142 1727204133.96287: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000008c 16142 1727204133.96295: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204133.96421: no more pending results, returning what we have 16142 1727204133.96426: results queue empty 16142 1727204133.96427: checking for any_errors_fatal 16142 1727204133.96439: done checking for any_errors_fatal 16142 1727204133.96440: checking for max_fail_percentage 16142 1727204133.96442: done checking for max_fail_percentage 16142 1727204133.96443: checking to see if all hosts have failed and the running result is not ok 16142 1727204133.96443: done checking to see if all hosts have failed 16142 1727204133.96444: getting the remaining hosts for this loop 16142 1727204133.96446: done getting the remaining hosts for this loop 16142 1727204133.96450: getting the next task for host managed-node2 16142 1727204133.96457: done getting next task for host managed-node2 16142 1727204133.96461: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 16142 1727204133.96466: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204133.96490: getting variables 16142 1727204133.96492: in VariableManager get_vars() 16142 1727204133.96543: Calling all_inventory to load vars for managed-node2 16142 1727204133.96546: Calling groups_inventory to load vars for managed-node2 16142 1727204133.96548: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204133.96557: Calling all_plugins_play to load vars for managed-node2 16142 1727204133.96559: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204133.96562: Calling groups_plugins_play to load vars for managed-node2 16142 1727204133.98137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204133.99496: done with get_vars() 16142 1727204133.99525: done getting variables 16142 1727204133.99574: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:55:33 -0400 (0:00:00.050) 0:00:33.172 ***** 16142 1727204133.99600: entering _queue_task() for managed-node2/debug 16142 1727204133.99849: worker is 1 (out of 1 available) 16142 1727204133.99866: exiting _queue_task() for managed-node2/debug 16142 1727204133.99877: done queuing things up, now waiting for results queue to drain 16142 1727204133.99878: waiting for pending results... 16142 1727204134.00084: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 16142 1727204134.00182: in run() - task 0affcd87-79f5-fddd-f6c7-00000000008d 16142 1727204134.00194: variable 'ansible_search_path' from source: unknown 16142 1727204134.00199: variable 'ansible_search_path' from source: unknown 16142 1727204134.00230: calling self._execute() 16142 1727204134.00301: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204134.00305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204134.00317: variable 'omit' from source: magic vars 16142 1727204134.00771: variable 'ansible_distribution_major_version' from source: facts 16142 1727204134.00775: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204134.00778: variable 'omit' from source: magic vars 16142 1727204134.00781: variable 'omit' from source: magic vars 16142 1727204134.00784: variable 'omit' from source: magic vars 16142 1727204134.00827: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204134.00871: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204134.00887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204134.00911: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204134.00930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204134.00975: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204134.00985: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204134.00993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204134.01110: Set connection var ansible_timeout to 10 16142 1727204134.01118: Set connection var ansible_connection to ssh 16142 1727204134.01127: Set connection var ansible_shell_type to sh 16142 1727204134.01136: Set connection var ansible_shell_executable to /bin/sh 16142 1727204134.01145: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204134.01182: Set connection var ansible_pipelining to False 16142 1727204134.01222: variable 'ansible_shell_executable' from source: unknown 16142 1727204134.01229: variable 'ansible_connection' from source: unknown 16142 1727204134.01235: variable 'ansible_module_compression' from source: unknown 16142 1727204134.01242: variable 'ansible_shell_type' from source: unknown 16142 1727204134.01247: variable 'ansible_shell_executable' from source: unknown 16142 1727204134.01253: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204134.01271: variable 'ansible_pipelining' from source: unknown 16142 1727204134.01279: variable 'ansible_timeout' from source: unknown 16142 1727204134.01287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204134.01432: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204134.01450: variable 'omit' from source: magic vars 16142 1727204134.01460: starting attempt loop 16142 1727204134.01470: running the handler 16142 1727204134.01625: variable '__network_connections_result' from source: set_fact 16142 1727204134.01684: handler run complete 16142 1727204134.01717: attempt loop complete, returning result 16142 1727204134.01725: _execute() done 16142 1727204134.01732: dumping result to json 16142 1727204134.01740: done dumping result, returning 16142 1727204134.01753: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-fddd-f6c7-00000000008d] 16142 1727204134.01762: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000008d 16142 1727204134.01888: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000008d 16142 1727204134.01891: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 16142 1727204134.02002: no more pending results, returning what we have 16142 1727204134.02006: results queue empty 16142 1727204134.02007: checking for any_errors_fatal 16142 1727204134.02014: done checking for any_errors_fatal 16142 1727204134.02015: checking for max_fail_percentage 16142 1727204134.02017: done checking for max_fail_percentage 16142 1727204134.02018: checking to see if all hosts have failed and the running result is not ok 16142 1727204134.02018: done checking to see if all hosts have failed 16142 1727204134.02019: getting the remaining hosts for this loop 16142 1727204134.02021: done getting the remaining hosts for this loop 16142 1727204134.02024: getting the next task for host managed-node2 16142 1727204134.02031: done getting next task for host managed-node2 16142 
1727204134.02035: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 16142 1727204134.02038: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204134.02051: getting variables 16142 1727204134.02053: in VariableManager get_vars() 16142 1727204134.02155: Calling all_inventory to load vars for managed-node2 16142 1727204134.02158: Calling groups_inventory to load vars for managed-node2 16142 1727204134.02160: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204134.02172: Calling all_plugins_play to load vars for managed-node2 16142 1727204134.02175: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204134.02178: Calling groups_plugins_play to load vars for managed-node2 16142 1727204134.03653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204134.04578: done with get_vars() 16142 1727204134.04597: done getting variables 16142 1727204134.04644: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:55:34 -0400 (0:00:00.050) 0:00:33.223 ***** 16142 1727204134.04671: entering _queue_task() for managed-node2/debug 16142 1727204134.04910: worker is 1 (out of 1 available) 16142 1727204134.04925: exiting _queue_task() for managed-node2/debug 16142 1727204134.04940: done queuing things up, now waiting for results queue to drain 16142 1727204134.04941: waiting for pending results... 
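The two "Show ... messages" tasks in this stretch are plain debug actions: they run on the controller (note there is no _low_level_execute_command / SSH traffic between "running the handler" and "handler run complete"), and simply render the __network_connections_result fact registered earlier in the role. The stderr_lines value of [""] printed above corresponds to a raw stderr of a single newline (the full result dumped by the next task shows "stderr": "\n"); the *_lines variants come from splitting the raw text on newlines. A minimal Python illustration of that split, not Ansible's actual helper:

# Splitting behaviour behind the "*_lines" fields: a stderr consisting of a
# single newline yields one empty line, matching the [""] shown above.
stderr = "\n"
print(stderr.splitlines())   # ['']
print("".splitlines())       # []  -- a truly empty stderr would give an empty list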
16142 1727204134.05131: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 16142 1727204134.05248: in run() - task 0affcd87-79f5-fddd-f6c7-00000000008e 16142 1727204134.05286: variable 'ansible_search_path' from source: unknown 16142 1727204134.05300: variable 'ansible_search_path' from source: unknown 16142 1727204134.05376: calling self._execute() 16142 1727204134.05547: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204134.05561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204134.05579: variable 'omit' from source: magic vars 16142 1727204134.05939: variable 'ansible_distribution_major_version' from source: facts 16142 1727204134.05958: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204134.05972: variable 'omit' from source: magic vars 16142 1727204134.06026: variable 'omit' from source: magic vars 16142 1727204134.06062: variable 'omit' from source: magic vars 16142 1727204134.06106: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204134.06145: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204134.06175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204134.06198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204134.06213: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204134.06247: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204134.06254: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204134.06261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204134.06367: Set connection var ansible_timeout to 10 16142 1727204134.06376: Set connection var ansible_connection to ssh 16142 1727204134.06387: Set connection var ansible_shell_type to sh 16142 1727204134.06400: Set connection var ansible_shell_executable to /bin/sh 16142 1727204134.06410: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204134.06422: Set connection var ansible_pipelining to False 16142 1727204134.06449: variable 'ansible_shell_executable' from source: unknown 16142 1727204134.06456: variable 'ansible_connection' from source: unknown 16142 1727204134.06462: variable 'ansible_module_compression' from source: unknown 16142 1727204134.06471: variable 'ansible_shell_type' from source: unknown 16142 1727204134.06477: variable 'ansible_shell_executable' from source: unknown 16142 1727204134.06482: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204134.06489: variable 'ansible_pipelining' from source: unknown 16142 1727204134.06494: variable 'ansible_timeout' from source: unknown 16142 1727204134.06501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204134.06636: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 
1727204134.06653: variable 'omit' from source: magic vars 16142 1727204134.06665: starting attempt loop 16142 1727204134.06672: running the handler 16142 1727204134.06720: variable '__network_connections_result' from source: set_fact 16142 1727204134.06803: variable '__network_connections_result' from source: set_fact 16142 1727204134.06916: handler run complete 16142 1727204134.06946: attempt loop complete, returning result 16142 1727204134.06953: _execute() done 16142 1727204134.06958: dumping result to json 16142 1727204134.06967: done dumping result, returning 16142 1727204134.06979: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-fddd-f6c7-00000000008e] 16142 1727204134.06992: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000008e ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 16142 1727204134.07187: no more pending results, returning what we have 16142 1727204134.07192: results queue empty 16142 1727204134.07193: checking for any_errors_fatal 16142 1727204134.07198: done checking for any_errors_fatal 16142 1727204134.07199: checking for max_fail_percentage 16142 1727204134.07201: done checking for max_fail_percentage 16142 1727204134.07202: checking to see if all hosts have failed and the running result is not ok 16142 1727204134.07202: done checking to see if all hosts have failed 16142 1727204134.07203: getting the remaining hosts for this loop 16142 1727204134.07205: done getting the remaining hosts for this loop 16142 1727204134.07208: getting the next task for host managed-node2 16142 1727204134.07215: done getting next task for host managed-node2 16142 1727204134.07219: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 16142 1727204134.07222: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204134.07237: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000008e 16142 1727204134.07241: WORKER PROCESS EXITING 16142 1727204134.07248: getting variables 16142 1727204134.07250: in VariableManager get_vars() 16142 1727204134.07306: Calling all_inventory to load vars for managed-node2 16142 1727204134.07309: Calling groups_inventory to load vars for managed-node2 16142 1727204134.07311: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204134.07320: Calling all_plugins_play to load vars for managed-node2 16142 1727204134.07323: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204134.07325: Calling groups_plugins_play to load vars for managed-node2 16142 1727204134.13903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204134.15619: done with get_vars() 16142 1727204134.15650: done getting variables 16142 1727204134.15706: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:55:34 -0400 (0:00:00.110) 0:00:33.334 ***** 16142 1727204134.15747: entering _queue_task() for managed-node2/debug 16142 1727204134.16104: worker is 1 (out of 1 available) 16142 1727204134.16117: exiting _queue_task() for managed-node2/debug 16142 1727204134.16129: done queuing things up, now waiting for results queue to drain 16142 1727204134.16130: waiting for pending results... 
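The task just queued ("Show debug messages for the network_state") is about to be skipped for the same reason "Configure networking state" was earlier: this play only drives network_connections, so network_state keeps its role default of {} and the conditional network_state != {} evaluates to False, while the distribution check evaluates to True. Conditionals are evaluated by templating the expression against the host's variables; a rough, self-contained sketch of that evaluation follows (a simplified stand-in for Ansible's templar, assuming jinja2 is installed; the fact value "9" is an assumed placeholder, the log only shows that it is not '6'):

# Rough sketch of 'when:' evaluation; the expressions are the two
# conditionals traced in this section of the log.
from jinja2 import Environment

variables = {
    "ansible_distribution_major_version": "9",  # assumed value; log only shows it != '6'
    "network_state": {},                        # role default, per "from source: role '' defaults"
}

def evaluate_conditional(expr: str, variables: dict) -> bool:
    rendered = Environment().from_string("{{ (" + expr + ") | lower }}").render(**variables)
    return rendered.strip() == "true"

for expr in ("ansible_distribution_major_version != '6'", "network_state != {}"):
    print(f"Evaluated conditional ({expr}):", evaluate_conditional(expr, variables))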
16142 1727204134.16468: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 16142 1727204134.16581: in run() - task 0affcd87-79f5-fddd-f6c7-00000000008f 16142 1727204134.16596: variable 'ansible_search_path' from source: unknown 16142 1727204134.16601: variable 'ansible_search_path' from source: unknown 16142 1727204134.16630: calling self._execute() 16142 1727204134.16708: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204134.16713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204134.16722: variable 'omit' from source: magic vars 16142 1727204134.17006: variable 'ansible_distribution_major_version' from source: facts 16142 1727204134.17018: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204134.17106: variable 'network_state' from source: role '' defaults 16142 1727204134.17114: Evaluated conditional (network_state != {}): False 16142 1727204134.17118: when evaluation is False, skipping this task 16142 1727204134.17121: _execute() done 16142 1727204134.17123: dumping result to json 16142 1727204134.17126: done dumping result, returning 16142 1727204134.17133: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-fddd-f6c7-00000000008f] 16142 1727204134.17147: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000008f 16142 1727204134.17245: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000008f 16142 1727204134.17248: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 16142 1727204134.17295: no more pending results, returning what we have 16142 1727204134.17299: results queue empty 16142 1727204134.17300: checking for any_errors_fatal 16142 1727204134.17312: done checking for any_errors_fatal 16142 1727204134.17313: checking for max_fail_percentage 16142 1727204134.17315: done checking for max_fail_percentage 16142 1727204134.17315: checking to see if all hosts have failed and the running result is not ok 16142 1727204134.17316: done checking to see if all hosts have failed 16142 1727204134.17317: getting the remaining hosts for this loop 16142 1727204134.17318: done getting the remaining hosts for this loop 16142 1727204134.17322: getting the next task for host managed-node2 16142 1727204134.17329: done getting next task for host managed-node2 16142 1727204134.17336: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 16142 1727204134.17339: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204134.17357: getting variables 16142 1727204134.17359: in VariableManager get_vars() 16142 1727204134.17408: Calling all_inventory to load vars for managed-node2 16142 1727204134.17411: Calling groups_inventory to load vars for managed-node2 16142 1727204134.17412: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204134.17421: Calling all_plugins_play to load vars for managed-node2 16142 1727204134.17423: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204134.17426: Calling groups_plugins_play to load vars for managed-node2 16142 1727204134.18411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204134.19838: done with get_vars() 16142 1727204134.19858: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:55:34 -0400 (0:00:00.041) 0:00:33.376 ***** 16142 1727204134.19932: entering _queue_task() for managed-node2/ping 16142 1727204134.20176: worker is 1 (out of 1 available) 16142 1727204134.20192: exiting _queue_task() for managed-node2/ping 16142 1727204134.20203: done queuing things up, now waiting for results queue to drain 16142 1727204134.20204: waiting for pending results... 16142 1727204134.20395: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 16142 1727204134.20487: in run() - task 0affcd87-79f5-fddd-f6c7-000000000090 16142 1727204134.20498: variable 'ansible_search_path' from source: unknown 16142 1727204134.20501: variable 'ansible_search_path' from source: unknown 16142 1727204134.20531: calling self._execute() 16142 1727204134.20610: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204134.20614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204134.20622: variable 'omit' from source: magic vars 16142 1727204134.20914: variable 'ansible_distribution_major_version' from source: facts 16142 1727204134.20924: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204134.20930: variable 'omit' from source: magic vars 16142 1727204134.20982: variable 'omit' from source: magic vars 16142 1727204134.21012: variable 'omit' from source: magic vars 16142 1727204134.21045: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204134.21093: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204134.21121: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204134.21134: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204134.21146: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204134.21187: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204134.21203: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204134.21213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204134.21324: Set connection var ansible_timeout to 10 16142 1727204134.21332: Set connection var 
ansible_connection to ssh 16142 1727204134.21346: Set connection var ansible_shell_type to sh 16142 1727204134.21356: Set connection var ansible_shell_executable to /bin/sh 16142 1727204134.21369: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204134.21382: Set connection var ansible_pipelining to False 16142 1727204134.21415: variable 'ansible_shell_executable' from source: unknown 16142 1727204134.21428: variable 'ansible_connection' from source: unknown 16142 1727204134.21441: variable 'ansible_module_compression' from source: unknown 16142 1727204134.21449: variable 'ansible_shell_type' from source: unknown 16142 1727204134.21455: variable 'ansible_shell_executable' from source: unknown 16142 1727204134.21462: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204134.21472: variable 'ansible_pipelining' from source: unknown 16142 1727204134.21478: variable 'ansible_timeout' from source: unknown 16142 1727204134.21484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204134.21716: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204134.21744: variable 'omit' from source: magic vars 16142 1727204134.21758: starting attempt loop 16142 1727204134.21768: running the handler 16142 1727204134.21787: _low_level_execute_command(): starting 16142 1727204134.21799: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204134.22599: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204134.22623: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204134.22643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204134.22662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204134.22707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204134.22723: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204134.22745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204134.22766: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204134.22780: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204134.22793: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204134.22806: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204134.22821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204134.22843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204134.22859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204134.22882: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204134.22897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204134.22982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 
1727204134.22998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204134.23011: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204134.23096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204134.24750: stdout chunk (state=3): >>>/root <<< 16142 1727204134.24898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204134.25029: stderr chunk (state=3): >>><<< 16142 1727204134.25033: stdout chunk (state=3): >>><<< 16142 1727204134.25070: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204134.25163: _low_level_execute_command(): starting 16142 1727204134.25170: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204134.2505584-18655-162205294319692 `" && echo ansible-tmp-1727204134.2505584-18655-162205294319692="` echo /root/.ansible/tmp/ansible-tmp-1727204134.2505584-18655-162205294319692 `" ) && sleep 0' 16142 1727204134.25833: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204134.25845: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204134.25870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204134.25920: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204134.25924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204134.25927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204134.26000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master <<< 16142 1727204134.26018: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204134.26032: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204134.26105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204134.27958: stdout chunk (state=3): >>>ansible-tmp-1727204134.2505584-18655-162205294319692=/root/.ansible/tmp/ansible-tmp-1727204134.2505584-18655-162205294319692 <<< 16142 1727204134.28073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204134.28125: stderr chunk (state=3): >>><<< 16142 1727204134.28128: stdout chunk (state=3): >>><<< 16142 1727204134.28145: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204134.2505584-18655-162205294319692=/root/.ansible/tmp/ansible-tmp-1727204134.2505584-18655-162205294319692 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204134.28272: variable 'ansible_module_compression' from source: unknown 16142 1727204134.28276: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 16142 1727204134.28381: variable 'ansible_facts' from source: unknown 16142 1727204134.28384: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204134.2505584-18655-162205294319692/AnsiballZ_ping.py 16142 1727204134.28569: Sending initial data 16142 1727204134.28572: Sent initial data (153 bytes) 16142 1727204134.30211: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204134.30246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204134.30249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204134.30296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204134.30299: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204134.30316: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204134.30318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204134.30370: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204134.30379: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204134.30423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204134.32187: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204134.32219: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204134.32273: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpvxzhd6hv /root/.ansible/tmp/ansible-tmp-1727204134.2505584-18655-162205294319692/AnsiballZ_ping.py <<< 16142 1727204134.32306: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204134.33494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204134.33672: stderr chunk (state=3): >>><<< 16142 1727204134.33676: stdout chunk (state=3): >>><<< 16142 1727204134.33678: done transferring module to remote 16142 1727204134.33680: _low_level_execute_command(): starting 16142 1727204134.33683: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204134.2505584-18655-162205294319692/ /root/.ansible/tmp/ansible-tmp-1727204134.2505584-18655-162205294319692/AnsiballZ_ping.py && sleep 0' 16142 1727204134.35183: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204134.35187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204134.35209: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204134.35216: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204134.35227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204134.35244: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204134.35253: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204134.35256: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204134.35267: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204134.35278: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204134.35292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204134.35300: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204134.35308: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204134.35322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204134.35396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204134.35414: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204134.35427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204134.35499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204134.37274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204134.37279: stdout chunk (state=3): >>><<< 16142 1727204134.37285: stderr chunk (state=3): >>><<< 16142 1727204134.37300: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204134.37303: _low_level_execute_command(): starting 16142 1727204134.37308: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204134.2505584-18655-162205294319692/AnsiballZ_ping.py && sleep 0' 16142 1727204134.37989: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204134.37997: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204134.38014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204134.38022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204134.38071: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204134.38078: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204134.38089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204134.38103: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204134.38110: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204134.38118: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204134.38123: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204134.38133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204134.38154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204134.38160: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204134.38170: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204134.38179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204134.38252: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204134.38275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204134.38289: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204134.38366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204134.51473: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 16142 1727204134.52613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 16142 1727204134.52618: stdout chunk (state=3): >>><<< 16142 1727204134.52621: stderr chunk (state=3): >>><<< 16142 1727204134.52643: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
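The exchange above is the standard push-and-execute cycle for a non-pipelined module: echo ~ to resolve the remote home, mkdir of a per-task temp directory, an sftp put of the self-contained AnsiballZ_ping.py payload, chmod u+x, execution with /usr/bin/python3.9, and finally (below) removal of the temp directory. The payload wraps Ansible's bundled ping module, which essentially echoes its data argument back as JSON on stdout. A simplified stand-in for the remote half of the exchange, not the real module source:

# Simplified stand-in for what AnsiballZ_ping.py ultimately does on the
# managed node: emit a JSON result echoing the "data" argument as "ping".
import json

def ping(module_args):
    data = module_args.get("data", "pong")
    if data == "crash":                # the real module treats data=crash as a forced failure
        raise Exception("boom")
    return {"ping": data, "invocation": {"module_args": module_args}}

print(json.dumps(ping({"data": "pong"})))
# {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}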
16142 1727204134.52668: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204134.2505584-18655-162205294319692/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204134.52683: _low_level_execute_command(): starting 16142 1727204134.52687: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204134.2505584-18655-162205294319692/ > /dev/null 2>&1 && sleep 0' 16142 1727204134.53919: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204134.53923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204134.54589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204134.54606: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204134.54622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204134.54646: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204134.54660: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204134.54673: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204134.54686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204134.54700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204134.54718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204134.54731: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204134.54746: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204134.54760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204134.54846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204134.54867: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204134.54883: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204134.54958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204134.56893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204134.56897: stdout chunk (state=3): >>><<< 16142 1727204134.56899: stderr chunk (state=3): >>><<< 16142 1727204134.57074: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204134.57078: handler run complete 16142 1727204134.57081: attempt loop complete, returning result 16142 1727204134.57083: _execute() done 16142 1727204134.57085: dumping result to json 16142 1727204134.57087: done dumping result, returning 16142 1727204134.57089: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-fddd-f6c7-000000000090] 16142 1727204134.57091: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000090 16142 1727204134.57170: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000090 16142 1727204134.57173: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 16142 1727204134.57238: no more pending results, returning what we have 16142 1727204134.57242: results queue empty 16142 1727204134.57243: checking for any_errors_fatal 16142 1727204134.57249: done checking for any_errors_fatal 16142 1727204134.57250: checking for max_fail_percentage 16142 1727204134.57251: done checking for max_fail_percentage 16142 1727204134.57252: checking to see if all hosts have failed and the running result is not ok 16142 1727204134.57253: done checking to see if all hosts have failed 16142 1727204134.57254: getting the remaining hosts for this loop 16142 1727204134.57255: done getting the remaining hosts for this loop 16142 1727204134.57258: getting the next task for host managed-node2 16142 1727204134.57269: done getting next task for host managed-node2 16142 1727204134.57272: ^ task is: TASK: meta (role_complete) 16142 1727204134.57275: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204134.57286: getting variables 16142 1727204134.57288: in VariableManager get_vars() 16142 1727204134.57342: Calling all_inventory to load vars for managed-node2 16142 1727204134.57345: Calling groups_inventory to load vars for managed-node2 16142 1727204134.57347: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204134.57356: Calling all_plugins_play to load vars for managed-node2 16142 1727204134.57358: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204134.57360: Calling groups_plugins_play to load vars for managed-node2 16142 1727204134.59291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204134.62322: done with get_vars() 16142 1727204134.62353: done getting variables 16142 1727204134.62563: done queuing things up, now waiting for results queue to drain 16142 1727204134.62568: results queue empty 16142 1727204134.62569: checking for any_errors_fatal 16142 1727204134.62572: done checking for any_errors_fatal 16142 1727204134.62573: checking for max_fail_percentage 16142 1727204134.62574: done checking for max_fail_percentage 16142 1727204134.62575: checking to see if all hosts have failed and the running result is not ok 16142 1727204134.62575: done checking to see if all hosts have failed 16142 1727204134.62576: getting the remaining hosts for this loop 16142 1727204134.62577: done getting the remaining hosts for this loop 16142 1727204134.62581: getting the next task for host managed-node2 16142 1727204134.62586: done getting next task for host managed-node2 16142 1727204134.62588: ^ task is: TASK: From the active connection, get the port1 profile "{{ port1_profile }}" 16142 1727204134.62590: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204134.62593: getting variables 16142 1727204134.62594: in VariableManager get_vars() 16142 1727204134.62616: Calling all_inventory to load vars for managed-node2 16142 1727204134.62619: Calling groups_inventory to load vars for managed-node2 16142 1727204134.62621: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204134.62627: Calling all_plugins_play to load vars for managed-node2 16142 1727204134.62629: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204134.62646: Calling groups_plugins_play to load vars for managed-node2 16142 1727204134.63899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204134.66378: done with get_vars() 16142 1727204134.66406: done getting variables 16142 1727204134.66456: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204134.66589: variable 'port1_profile' from source: play vars TASK [From the active connection, get the port1 profile "bond0.0"] ************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:104 Tuesday 24 September 2024 14:55:34 -0400 (0:00:00.466) 0:00:33.843 ***** 16142 1727204134.66625: entering _queue_task() for managed-node2/command 16142 1727204134.67299: worker is 1 (out of 1 available) 16142 1727204134.67313: exiting _queue_task() for managed-node2/command 16142 1727204134.67326: done queuing things up, now waiting for results queue to drain 16142 1727204134.67327: waiting for pending results... 
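Notice how the TASK banner above already reads "bond0.0" even though the raw task name in tests_bond_removal.yml contains "{{ port1_profile }}": the name is templated against play vars (the "variable 'port1_profile' from source: play vars" entry) before the banner is printed. A small sketch of that rendering step, assuming jinja2 is installed; this is an illustration, not Ansible's templar:

# Sketch of the name templating visible in the TASK banner above.
from jinja2 import Template

raw_name = 'From the active connection, get the port1 profile "{{ port1_profile }}"'
print(Template(raw_name).render(port1_profile="bond0.0"))
# From the active connection, get the port1 profile "bond0.0"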
16142 1727204134.68128: running TaskExecutor() for managed-node2/TASK: From the active connection, get the port1 profile "bond0.0" 16142 1727204134.68242: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000c0 16142 1727204134.68271: variable 'ansible_search_path' from source: unknown 16142 1727204134.68314: calling self._execute() 16142 1727204134.68420: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204134.68432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204134.68453: variable 'omit' from source: magic vars 16142 1727204134.68861: variable 'ansible_distribution_major_version' from source: facts 16142 1727204134.68881: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204134.69025: variable 'network_provider' from source: set_fact 16142 1727204134.69053: Evaluated conditional (network_provider == "nm"): True 16142 1727204134.69074: variable 'omit' from source: magic vars 16142 1727204134.69116: variable 'omit' from source: magic vars 16142 1727204134.69229: variable 'port1_profile' from source: play vars 16142 1727204134.69276: variable 'omit' from source: magic vars 16142 1727204134.69365: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204134.69408: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204134.69437: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204134.69487: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204134.69511: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204134.69551: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204134.69562: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204134.69576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204134.69696: Set connection var ansible_timeout to 10 16142 1727204134.69703: Set connection var ansible_connection to ssh 16142 1727204134.69713: Set connection var ansible_shell_type to sh 16142 1727204134.69723: Set connection var ansible_shell_executable to /bin/sh 16142 1727204134.69733: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204134.69778: Set connection var ansible_pipelining to False 16142 1727204134.69867: variable 'ansible_shell_executable' from source: unknown 16142 1727204134.69886: variable 'ansible_connection' from source: unknown 16142 1727204134.69915: variable 'ansible_module_compression' from source: unknown 16142 1727204134.69954: variable 'ansible_shell_type' from source: unknown 16142 1727204134.69962: variable 'ansible_shell_executable' from source: unknown 16142 1727204134.69970: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204134.69977: variable 'ansible_pipelining' from source: unknown 16142 1727204134.69983: variable 'ansible_timeout' from source: unknown 16142 1727204134.69989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204134.70146: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204134.70162: variable 'omit' from source: magic vars 16142 1727204134.70174: starting attempt loop 16142 1727204134.70180: running the handler 16142 1727204134.70205: _low_level_execute_command(): starting 16142 1727204134.70218: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204134.71151: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204134.71169: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204134.71184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204134.71283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204134.71510: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204134.71524: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204134.71553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204134.71587: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204134.71602: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204134.71617: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204134.71657: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204134.71681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204134.71699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204134.71714: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204134.71727: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204134.71742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204134.71828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204134.71846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204134.71860: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204134.71933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204134.73569: stdout chunk (state=3): >>>/root <<< 16142 1727204134.73673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204134.73754: stderr chunk (state=3): >>><<< 16142 1727204134.73757: stdout chunk (state=3): >>><<< 16142 1727204134.73881: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204134.73884: _low_level_execute_command(): starting 16142 1727204134.73887: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204134.737843-18685-57337695594335 `" && echo ansible-tmp-1727204134.737843-18685-57337695594335="` echo /root/.ansible/tmp/ansible-tmp-1727204134.737843-18685-57337695594335 `" ) && sleep 0' 16142 1727204134.74766: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204134.74782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204134.74798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204134.74823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204134.74868: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204134.74881: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204134.74896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204134.74914: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204134.74933: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204134.74946: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204134.74958: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204134.74973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204134.74987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204134.74999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204134.75010: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204134.75023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204134.75118: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204134.75143: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204134.75200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204134.75294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204134.77146: stdout chunk (state=3): >>>ansible-tmp-1727204134.737843-18685-57337695594335=/root/.ansible/tmp/ansible-tmp-1727204134.737843-18685-57337695594335 <<< 16142 1727204134.77269: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204134.77348: stderr chunk 
(state=3): >>><<< 16142 1727204134.77360: stdout chunk (state=3): >>><<< 16142 1727204134.77576: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204134.737843-18685-57337695594335=/root/.ansible/tmp/ansible-tmp-1727204134.737843-18685-57337695594335 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204134.77580: variable 'ansible_module_compression' from source: unknown 16142 1727204134.77583: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 16142 1727204134.77585: variable 'ansible_facts' from source: unknown 16142 1727204134.77603: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204134.737843-18685-57337695594335/AnsiballZ_command.py 16142 1727204134.77761: Sending initial data 16142 1727204134.77767: Sent initial data (154 bytes) 16142 1727204134.78726: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204134.78741: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204134.78755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204134.78777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204134.78820: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204134.78831: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204134.78844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204134.78860: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204134.78875: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204134.78886: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204134.78898: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204134.78910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204134.78923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204134.78934: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 
1727204134.78944: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204134.78956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204134.79031: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204134.79053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204134.79070: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204134.79137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204134.80869: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204134.80898: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204134.80939: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpr589dl7w /root/.ansible/tmp/ansible-tmp-1727204134.737843-18685-57337695594335/AnsiballZ_command.py <<< 16142 1727204134.80973: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204134.82128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204134.82257: stderr chunk (state=3): >>><<< 16142 1727204134.82261: stdout chunk (state=3): >>><<< 16142 1727204134.82263: done transferring module to remote 16142 1727204134.82271: _low_level_execute_command(): starting 16142 1727204134.82278: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204134.737843-18685-57337695594335/ /root/.ansible/tmp/ansible-tmp-1727204134.737843-18685-57337695594335/AnsiballZ_command.py && sleep 0' 16142 1727204134.82838: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204134.82853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204134.82871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204134.82890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204134.82934: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204134.82947: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204134.82962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204134.82982: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204134.82993: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204134.83007: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204134.83019: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 
1727204134.83032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204134.83048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204134.83061: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204134.83083: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204134.83098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204134.83173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204134.83190: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204134.83205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204134.83296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204134.85058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204134.85135: stderr chunk (state=3): >>><<< 16142 1727204134.85147: stdout chunk (state=3): >>><<< 16142 1727204134.85176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204134.85181: _low_level_execute_command(): starting 16142 1727204134.85187: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204134.737843-18685-57337695594335/AnsiballZ_command.py && sleep 0' 16142 1727204134.85928: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204134.85946: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204134.85957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204134.85973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204134.86014: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204134.86022: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204134.86035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204134.86057: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 
16142 1727204134.86065: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204134.86075: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204134.86083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204134.86092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204134.86103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204134.86110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204134.86117: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204134.86126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204134.86211: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204134.86229: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204134.86246: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204134.86335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204135.01533: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.0"], "start": "2024-09-24 14:55:34.995199", "end": "2024-09-24 14:55:35.014599", "delta": "0:00:00.019400", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 16142 1727204135.02827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204135.02832: stdout chunk (state=3): >>><<< 16142 1727204135.02843: stderr chunk (state=3): >>><<< 16142 1727204135.02883: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.0"], "start": "2024-09-24 14:55:34.995199", "end": "2024-09-24 14:55:35.014599", "delta": "0:00:00.019400", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
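The JSON blob above is the raw return from ansible.legacy.command for the bond0.0 lookup: nmcli found no active connection named "bond0.0", so stdout is empty and rc is 0. For orientation, the task driving this execution most likely looks like the sketch below; this is a reconstruction from the task name, the evaluated conditionals, and the variable referenced later in this log, not the actual source of tests_bond_removal.yml (which is not shown here), and the register name and changed_when are assumptions.

# Hedged reconstruction of the port1 lookup task (assumptions noted inline).
- name: From the active connection, get the port1 profile "{{ port1_profile }}"
  command: nmcli c show --active {{ port1_profile }}   # rendered as "bond0.0" in the log above
  register: active_port1_profile                       # name taken from the assert conditional later in the log
  changed_when: false                                  # assumption, consistent with "Evaluated conditional (False): False"
  when: network_provider == "nm"                       # conditional evaluated as True above

Note that the raw module result reports "changed": true while the task result printed further down shows "changed": false, which is what a changed_when: false override would produce.
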
16142 1727204135.02927: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli c show --active bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204134.737843-18685-57337695594335/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204135.02934: _low_level_execute_command(): starting 16142 1727204135.02942: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204134.737843-18685-57337695594335/ > /dev/null 2>&1 && sleep 0' 16142 1727204135.04121: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204135.04138: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204135.04154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204135.04175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204135.04218: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204135.04231: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204135.04248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204135.04270: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204135.04283: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204135.04296: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204135.04309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204135.04323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204135.04339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204135.04353: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204135.04368: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204135.04383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204135.04457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204135.04486: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204135.04504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204135.04583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204135.06381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204135.06406: stderr chunk (state=3): >>><<< 16142 1727204135.06410: stdout chunk (state=3): >>><<< 16142 1727204135.06426: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204135.06431: handler run complete 16142 1727204135.06456: Evaluated conditional (False): False 16142 1727204135.06466: attempt loop complete, returning result 16142 1727204135.06469: _execute() done 16142 1727204135.06472: dumping result to json 16142 1727204135.06477: done dumping result, returning 16142 1727204135.06485: done running TaskExecutor() for managed-node2/TASK: From the active connection, get the port1 profile "bond0.0" [0affcd87-79f5-fddd-f6c7-0000000000c0] 16142 1727204135.06490: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000c0 16142 1727204135.06590: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000c0 16142 1727204135.06593: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "nmcli", "c", "show", "--active", "bond0.0" ], "delta": "0:00:00.019400", "end": "2024-09-24 14:55:35.014599", "rc": 0, "start": "2024-09-24 14:55:34.995199" } 16142 1727204135.06677: no more pending results, returning what we have 16142 1727204135.06681: results queue empty 16142 1727204135.06682: checking for any_errors_fatal 16142 1727204135.06684: done checking for any_errors_fatal 16142 1727204135.06684: checking for max_fail_percentage 16142 1727204135.06686: done checking for max_fail_percentage 16142 1727204135.06687: checking to see if all hosts have failed and the running result is not ok 16142 1727204135.06688: done checking to see if all hosts have failed 16142 1727204135.06689: getting the remaining hosts for this loop 16142 1727204135.06690: done getting the remaining hosts for this loop 16142 1727204135.06694: getting the next task for host managed-node2 16142 1727204135.06700: done getting next task for host managed-node2 16142 1727204135.06703: ^ task is: TASK: From the active connection, get the port2 profile "{{ port2_profile }}" 16142 1727204135.06705: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204135.06709: getting variables 16142 1727204135.06710: in VariableManager get_vars() 16142 1727204135.06795: Calling all_inventory to load vars for managed-node2 16142 1727204135.06798: Calling groups_inventory to load vars for managed-node2 16142 1727204135.06800: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204135.06810: Calling all_plugins_play to load vars for managed-node2 16142 1727204135.06813: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204135.06816: Calling groups_plugins_play to load vars for managed-node2 16142 1727204135.08223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204135.09168: done with get_vars() 16142 1727204135.09187: done getting variables 16142 1727204135.09235: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204135.09322: variable 'port2_profile' from source: play vars TASK [From the active connection, get the port2 profile "bond0.1"] ************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:111 Tuesday 24 September 2024 14:55:35 -0400 (0:00:00.427) 0:00:34.270 ***** 16142 1727204135.09354: entering _queue_task() for managed-node2/command 16142 1727204135.09594: worker is 1 (out of 1 available) 16142 1727204135.09607: exiting _queue_task() for managed-node2/command 16142 1727204135.09621: done queuing things up, now waiting for results queue to drain 16142 1727204135.09622: waiting for pending results... 
16142 1727204135.09809: running TaskExecutor() for managed-node2/TASK: From the active connection, get the port2 profile "bond0.1" 16142 1727204135.09877: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000c1 16142 1727204135.09894: variable 'ansible_search_path' from source: unknown 16142 1727204135.09924: calling self._execute() 16142 1727204135.10015: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204135.10018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204135.10087: variable 'omit' from source: magic vars 16142 1727204135.10530: variable 'ansible_distribution_major_version' from source: facts 16142 1727204135.10568: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204135.10702: variable 'network_provider' from source: set_fact 16142 1727204135.10714: Evaluated conditional (network_provider == "nm"): True 16142 1727204135.10730: variable 'omit' from source: magic vars 16142 1727204135.10756: variable 'omit' from source: magic vars 16142 1727204135.10867: variable 'port2_profile' from source: play vars 16142 1727204135.10898: variable 'omit' from source: magic vars 16142 1727204135.10951: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204135.11000: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204135.11029: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204135.11061: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204135.11082: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204135.11127: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204135.11140: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204135.11148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204135.11270: Set connection var ansible_timeout to 10 16142 1727204135.11280: Set connection var ansible_connection to ssh 16142 1727204135.11289: Set connection var ansible_shell_type to sh 16142 1727204135.11298: Set connection var ansible_shell_executable to /bin/sh 16142 1727204135.11306: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204135.11326: Set connection var ansible_pipelining to False 16142 1727204135.11355: variable 'ansible_shell_executable' from source: unknown 16142 1727204135.11362: variable 'ansible_connection' from source: unknown 16142 1727204135.11371: variable 'ansible_module_compression' from source: unknown 16142 1727204135.11380: variable 'ansible_shell_type' from source: unknown 16142 1727204135.11386: variable 'ansible_shell_executable' from source: unknown 16142 1727204135.11392: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204135.11399: variable 'ansible_pipelining' from source: unknown 16142 1727204135.11405: variable 'ansible_timeout' from source: unknown 16142 1727204135.11412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204135.11692: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204135.11713: variable 'omit' from source: magic vars 16142 1727204135.11722: starting attempt loop 16142 1727204135.11729: running the handler 16142 1727204135.11784: _low_level_execute_command(): starting 16142 1727204135.11803: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204135.12340: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204135.12368: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204135.12383: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204135.12432: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204135.12448: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204135.12502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204135.14067: stdout chunk (state=3): >>>/root <<< 16142 1727204135.14369: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204135.14373: stdout chunk (state=3): >>><<< 16142 1727204135.14375: stderr chunk (state=3): >>><<< 16142 1727204135.14378: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204135.14381: _low_level_execute_command(): starting 16142 1727204135.14383: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && 
mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204135.1428928-18705-37588890631984 `" && echo ansible-tmp-1727204135.1428928-18705-37588890631984="` echo /root/.ansible/tmp/ansible-tmp-1727204135.1428928-18705-37588890631984 `" ) && sleep 0' 16142 1727204135.15027: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204135.15036: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204135.15050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204135.15063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204135.15102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204135.15115: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204135.15118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204135.15131: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204135.15143: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204135.15149: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204135.15157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204135.15170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204135.15180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204135.15187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204135.15194: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204135.15203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204135.15278: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204135.15295: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204135.15306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204135.15378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204135.17209: stdout chunk (state=3): >>>ansible-tmp-1727204135.1428928-18705-37588890631984=/root/.ansible/tmp/ansible-tmp-1727204135.1428928-18705-37588890631984 <<< 16142 1727204135.17337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204135.17405: stderr chunk (state=3): >>><<< 16142 1727204135.17411: stdout chunk (state=3): >>><<< 16142 1727204135.17436: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204135.1428928-18705-37588890631984=/root/.ansible/tmp/ansible-tmp-1727204135.1428928-18705-37588890631984 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204135.17548: variable 'ansible_module_compression' from source: unknown 16142 1727204135.17552: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 16142 1727204135.17577: variable 'ansible_facts' from source: unknown 16142 1727204135.17645: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204135.1428928-18705-37588890631984/AnsiballZ_command.py 16142 1727204135.17793: Sending initial data 16142 1727204135.17797: Sent initial data (155 bytes) 16142 1727204135.18734: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204135.18738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204135.18778: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204135.18782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204135.18784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204135.18841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204135.18844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204135.18891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204135.20588: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 16142 1727204135.20593: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204135.20628: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server 
handle limit 1019; using 64 <<< 16142 1727204135.20673: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmprpwb9r3w /root/.ansible/tmp/ansible-tmp-1727204135.1428928-18705-37588890631984/AnsiballZ_command.py <<< 16142 1727204135.20709: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204135.21498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204135.21609: stderr chunk (state=3): >>><<< 16142 1727204135.21613: stdout chunk (state=3): >>><<< 16142 1727204135.21631: done transferring module to remote 16142 1727204135.21642: _low_level_execute_command(): starting 16142 1727204135.21646: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204135.1428928-18705-37588890631984/ /root/.ansible/tmp/ansible-tmp-1727204135.1428928-18705-37588890631984/AnsiballZ_command.py && sleep 0' 16142 1727204135.22116: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204135.22140: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204135.22152: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204135.22170: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204135.22217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204135.22229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204135.22288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204135.24014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204135.24071: stderr chunk (state=3): >>><<< 16142 1727204135.24075: stdout chunk (state=3): >>><<< 16142 1727204135.24089: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204135.24092: _low_level_execute_command(): starting 16142 1727204135.24097: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204135.1428928-18705-37588890631984/AnsiballZ_command.py && sleep 0' 16142 1727204135.24554: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204135.24573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204135.24591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204135.24601: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204135.24649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204135.24661: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204135.24714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204135.39694: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.1"], "start": "2024-09-24 14:55:35.376849", "end": "2024-09-24 14:55:35.396177", "delta": "0:00:00.019328", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 16142 1727204135.41019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204135.41023: stderr chunk (state=3): >>><<< 16142 1727204135.41026: stdout chunk (state=3): >>><<< 16142 1727204135.41052: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.1"], "start": "2024-09-24 14:55:35.376849", "end": "2024-09-24 14:55:35.396177", "delta": "0:00:00.019328", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
16142 1727204135.41093: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli c show --active bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204135.1428928-18705-37588890631984/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204135.41101: _low_level_execute_command(): starting 16142 1727204135.41106: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204135.1428928-18705-37588890631984/ > /dev/null 2>&1 && sleep 0' 16142 1727204135.42486: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204135.42494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204135.42504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204135.42518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204135.42597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204135.42634: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204135.42650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204135.42667: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204135.42676: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204135.42684: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204135.42692: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204135.42728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204135.42742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204135.42750: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204135.42757: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204135.42772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204135.42848: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204135.42992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204135.43006: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204135.43076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204135.44953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204135.44957: stdout chunk (state=3): >>><<< 16142 1727204135.44966: stderr chunk (state=3): >>><<< 16142 1727204135.45003: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204135.45009: handler run complete 16142 1727204135.45035: Evaluated conditional (False): False 16142 1727204135.45049: attempt loop complete, returning result 16142 1727204135.45052: _execute() done 16142 1727204135.45054: dumping result to json 16142 1727204135.45059: done dumping result, returning 16142 1727204135.45070: done running TaskExecutor() for managed-node2/TASK: From the active connection, get the port2 profile "bond0.1" [0affcd87-79f5-fddd-f6c7-0000000000c1] 16142 1727204135.45076: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000c1 16142 1727204135.45184: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000c1 16142 1727204135.45187: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "nmcli", "c", "show", "--active", "bond0.1" ], "delta": "0:00:00.019328", "end": "2024-09-24 14:55:35.396177", "rc": 0, "start": "2024-09-24 14:55:35.376849" } 16142 1727204135.45255: no more pending results, returning what we have 16142 1727204135.45259: results queue empty 16142 1727204135.45260: checking for any_errors_fatal 16142 1727204135.45270: done checking for any_errors_fatal 16142 1727204135.45271: checking for max_fail_percentage 16142 1727204135.45274: done checking for max_fail_percentage 16142 1727204135.45275: checking to see if all hosts have failed and the running result is not ok 16142 1727204135.45276: done checking to see if all hosts have failed 16142 1727204135.45277: getting the remaining hosts for this loop 16142 1727204135.45278: done getting the remaining hosts for this loop 16142 1727204135.45283: getting the next task for host managed-node2 16142 1727204135.45289: done getting next task for host managed-node2 16142 1727204135.45292: ^ task is: TASK: Assert that the port1 profile is not activated 16142 1727204135.45294: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204135.45298: getting variables 16142 1727204135.45300: in VariableManager get_vars() 16142 1727204135.45354: Calling all_inventory to load vars for managed-node2 16142 1727204135.45357: Calling groups_inventory to load vars for managed-node2 16142 1727204135.45359: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204135.45371: Calling all_plugins_play to load vars for managed-node2 16142 1727204135.45373: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204135.45376: Calling groups_plugins_play to load vars for managed-node2 16142 1727204135.47280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204135.49174: done with get_vars() 16142 1727204135.49201: done getting variables 16142 1727204135.49273: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the port1 profile is not activated] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:118 Tuesday 24 September 2024 14:55:35 -0400 (0:00:00.399) 0:00:34.669 ***** 16142 1727204135.49305: entering _queue_task() for managed-node2/assert 16142 1727204135.49650: worker is 1 (out of 1 available) 16142 1727204135.49667: exiting _queue_task() for managed-node2/assert 16142 1727204135.49681: done queuing things up, now waiting for results queue to drain 16142 1727204135.49682: waiting for pending results... 16142 1727204135.50013: running TaskExecutor() for managed-node2/TASK: Assert that the port1 profile is not activated 16142 1727204135.50119: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000c2 16142 1727204135.50149: variable 'ansible_search_path' from source: unknown 16142 1727204135.50197: calling self._execute() 16142 1727204135.50324: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204135.50335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204135.50362: variable 'omit' from source: magic vars 16142 1727204135.50962: variable 'ansible_distribution_major_version' from source: facts 16142 1727204135.50984: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204135.51109: variable 'network_provider' from source: set_fact 16142 1727204135.51124: Evaluated conditional (network_provider == "nm"): True 16142 1727204135.51141: variable 'omit' from source: magic vars 16142 1727204135.51167: variable 'omit' from source: magic vars 16142 1727204135.51278: variable 'port1_profile' from source: play vars 16142 1727204135.51316: variable 'omit' from source: magic vars 16142 1727204135.51370: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204135.51412: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204135.51440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204135.51473: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204135.51491: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204135.51526: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204135.51538: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204135.51548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204135.51662: Set connection var ansible_timeout to 10 16142 1727204135.51678: Set connection var ansible_connection to ssh 16142 1727204135.51691: Set connection var ansible_shell_type to sh 16142 1727204135.51702: Set connection var ansible_shell_executable to /bin/sh 16142 1727204135.51712: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204135.51723: Set connection var ansible_pipelining to False 16142 1727204135.51753: variable 'ansible_shell_executable' from source: unknown 16142 1727204135.51761: variable 'ansible_connection' from source: unknown 16142 1727204135.51771: variable 'ansible_module_compression' from source: unknown 16142 1727204135.51782: variable 'ansible_shell_type' from source: unknown 16142 1727204135.51793: variable 'ansible_shell_executable' from source: unknown 16142 1727204135.51801: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204135.51809: variable 'ansible_pipelining' from source: unknown 16142 1727204135.51816: variable 'ansible_timeout' from source: unknown 16142 1727204135.51823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204135.51968: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204135.51987: variable 'omit' from source: magic vars 16142 1727204135.52005: starting attempt loop 16142 1727204135.52016: running the handler 16142 1727204135.52192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204135.54778: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204135.54873: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204135.54915: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204135.54965: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204135.54996: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204135.55077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204135.55110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204135.55140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204135.55192: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204135.55210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204135.55327: variable 'active_port1_profile' from source: set_fact 16142 1727204135.55352: Evaluated conditional (active_port1_profile.stdout | length == 0): True 16142 1727204135.55367: handler run complete 16142 1727204135.55392: attempt loop complete, returning result 16142 1727204135.55399: _execute() done 16142 1727204135.55405: dumping result to json 16142 1727204135.55412: done dumping result, returning 16142 1727204135.55422: done running TaskExecutor() for managed-node2/TASK: Assert that the port1 profile is not activated [0affcd87-79f5-fddd-f6c7-0000000000c2] 16142 1727204135.55431: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000c2 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 16142 1727204135.55590: no more pending results, returning what we have 16142 1727204135.55594: results queue empty 16142 1727204135.55595: checking for any_errors_fatal 16142 1727204135.55601: done checking for any_errors_fatal 16142 1727204135.55602: checking for max_fail_percentage 16142 1727204135.55604: done checking for max_fail_percentage 16142 1727204135.55605: checking to see if all hosts have failed and the running result is not ok 16142 1727204135.55605: done checking to see if all hosts have failed 16142 1727204135.55606: getting the remaining hosts for this loop 16142 1727204135.55608: done getting the remaining hosts for this loop 16142 1727204135.55611: getting the next task for host managed-node2 16142 1727204135.55619: done getting next task for host managed-node2 16142 1727204135.55621: ^ task is: TASK: Assert that the port2 profile is not activated 16142 1727204135.55624: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204135.55627: getting variables 16142 1727204135.55630: in VariableManager get_vars() 16142 1727204135.55687: Calling all_inventory to load vars for managed-node2 16142 1727204135.55697: Calling groups_inventory to load vars for managed-node2 16142 1727204135.55699: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204135.55710: Calling all_plugins_play to load vars for managed-node2 16142 1727204135.55713: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204135.55716: Calling groups_plugins_play to load vars for managed-node2 16142 1727204135.56743: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000c2 16142 1727204135.56746: WORKER PROCESS EXITING 16142 1727204135.57517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204135.59506: done with get_vars() 16142 1727204135.59534: done getting variables 16142 1727204135.59606: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the port2 profile is not activated] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:125 Tuesday 24 September 2024 14:55:35 -0400 (0:00:00.103) 0:00:34.773 ***** 16142 1727204135.59638: entering _queue_task() for managed-node2/assert 16142 1727204135.59992: worker is 1 (out of 1 available) 16142 1727204135.60008: exiting _queue_task() for managed-node2/assert 16142 1727204135.60021: done queuing things up, now waiting for results queue to drain 16142 1727204135.60023: waiting for pending results... 
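Note: the command recorded above ("nmcli", "c", "show", "--active", "bond0.1") and the registered variable names the later asserts consume (active_port1_profile, active_port2_profile) suggest the test playbook registers the nmcli output roughly as in the hypothetical sketch below. The literal task text lives in tests_bond_removal.yml and is not reproduced in this log; the variable and task names here are inferred from the log output.

    # hypothetical reconstruction of the profile-query task
    - name: From the active connection, get the port2 profile "{{ port2_profile }}"
      command: nmcli c show --active {{ port2_profile }}
      register: active_port2_profile   # empty stdout means the profile is no longer active
      when: network_provider == "nm"

An analogous task registering active_port1_profile for port1_profile would precede it; both results are only meaningful under the NetworkManager provider, hence the "nm" guard that the log shows being evaluated.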
16142 1727204135.60321: running TaskExecutor() for managed-node2/TASK: Assert that the port2 profile is not activated 16142 1727204135.60435: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000c3 16142 1727204135.60461: variable 'ansible_search_path' from source: unknown 16142 1727204135.60506: calling self._execute() 16142 1727204135.60626: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204135.60638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204135.60655: variable 'omit' from source: magic vars 16142 1727204135.61085: variable 'ansible_distribution_major_version' from source: facts 16142 1727204135.61110: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204135.61240: variable 'network_provider' from source: set_fact 16142 1727204135.61255: Evaluated conditional (network_provider == "nm"): True 16142 1727204135.61271: variable 'omit' from source: magic vars 16142 1727204135.61294: variable 'omit' from source: magic vars 16142 1727204135.61406: variable 'port2_profile' from source: play vars 16142 1727204135.61432: variable 'omit' from source: magic vars 16142 1727204135.61481: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204135.61524: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204135.61561: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204135.61586: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204135.61603: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204135.61639: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204135.61651: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204135.61659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204135.61776: Set connection var ansible_timeout to 10 16142 1727204135.61785: Set connection var ansible_connection to ssh 16142 1727204135.61795: Set connection var ansible_shell_type to sh 16142 1727204135.61803: Set connection var ansible_shell_executable to /bin/sh 16142 1727204135.61812: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204135.61822: Set connection var ansible_pipelining to False 16142 1727204135.61850: variable 'ansible_shell_executable' from source: unknown 16142 1727204135.61858: variable 'ansible_connection' from source: unknown 16142 1727204135.61874: variable 'ansible_module_compression' from source: unknown 16142 1727204135.61886: variable 'ansible_shell_type' from source: unknown 16142 1727204135.61894: variable 'ansible_shell_executable' from source: unknown 16142 1727204135.61900: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204135.61907: variable 'ansible_pipelining' from source: unknown 16142 1727204135.61912: variable 'ansible_timeout' from source: unknown 16142 1727204135.61919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204135.62072: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204135.62093: variable 'omit' from source: magic vars 16142 1727204135.62105: starting attempt loop 16142 1727204135.62114: running the handler 16142 1727204135.62298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204135.65855: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204135.65943: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204135.65994: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204135.66134: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204135.66166: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204135.66242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204135.66278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204135.66307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204135.66359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204135.66380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204135.66499: variable 'active_port2_profile' from source: set_fact 16142 1727204135.66527: Evaluated conditional (active_port2_profile.stdout | length == 0): True 16142 1727204135.66544: handler run complete 16142 1727204135.66577: attempt loop complete, returning result 16142 1727204135.66584: _execute() done 16142 1727204135.66590: dumping result to json 16142 1727204135.66597: done dumping result, returning 16142 1727204135.66608: done running TaskExecutor() for managed-node2/TASK: Assert that the port2 profile is not activated [0affcd87-79f5-fddd-f6c7-0000000000c3] 16142 1727204135.66617: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000c3 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 16142 1727204135.66763: no more pending results, returning what we have 16142 1727204135.66771: results queue empty 16142 1727204135.66772: checking for any_errors_fatal 16142 1727204135.66778: done checking for any_errors_fatal 16142 1727204135.66779: checking for max_fail_percentage 16142 1727204135.66781: done checking for max_fail_percentage 16142 1727204135.66782: checking to see if all hosts have failed and the running result is not ok 16142 1727204135.66783: done checking to see if all hosts have failed 16142 1727204135.66783: getting the 
remaining hosts for this loop 16142 1727204135.66785: done getting the remaining hosts for this loop 16142 1727204135.66789: getting the next task for host managed-node2 16142 1727204135.66796: done getting next task for host managed-node2 16142 1727204135.66799: ^ task is: TASK: Get the port1 device state 16142 1727204135.66801: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204135.66804: getting variables 16142 1727204135.66806: in VariableManager get_vars() 16142 1727204135.66868: Calling all_inventory to load vars for managed-node2 16142 1727204135.66872: Calling groups_inventory to load vars for managed-node2 16142 1727204135.66882: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204135.66893: Calling all_plugins_play to load vars for managed-node2 16142 1727204135.66896: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204135.66900: Calling groups_plugins_play to load vars for managed-node2 16142 1727204135.67907: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000c3 16142 1727204135.67911: WORKER PROCESS EXITING 16142 1727204135.68784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204135.70472: done with get_vars() 16142 1727204135.70499: done getting variables 16142 1727204135.70561: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the port1 device state] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:132 Tuesday 24 September 2024 14:55:35 -0400 (0:00:00.109) 0:00:34.882 ***** 16142 1727204135.70593: entering _queue_task() for managed-node2/command 16142 1727204135.70925: worker is 1 (out of 1 available) 16142 1727204135.70937: exiting _queue_task() for managed-node2/command 16142 1727204135.70950: done queuing things up, now waiting for results queue to drain 16142 1727204135.70952: waiting for pending results... 
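Note: the conditions evaluated above, active_port1_profile.stdout | length == 0 and active_port2_profile.stdout | length == 0, both ending in "All assertions passed", correspond to assert tasks of roughly this shape. This is a hedged reconstruction, not the verbatim playbook text:

    # sketch of the port1 assert; the port2 task is identical with "port2" substituted
    - name: Assert that the port1 profile is not activated
      assert:
        that:
          - active_port1_profile.stdout | length == 0
      when: network_provider == "nm"

The assert passes exactly when the earlier nmcli query printed nothing, i.e. the bond port profile is no longer among the active connections.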
16142 1727204135.71246: running TaskExecutor() for managed-node2/TASK: Get the port1 device state 16142 1727204135.71355: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000c4 16142 1727204135.71376: variable 'ansible_search_path' from source: unknown 16142 1727204135.71428: calling self._execute() 16142 1727204135.71542: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204135.71554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204135.71571: variable 'omit' from source: magic vars 16142 1727204135.71966: variable 'ansible_distribution_major_version' from source: facts 16142 1727204135.71984: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204135.72112: variable 'network_provider' from source: set_fact 16142 1727204135.72123: Evaluated conditional (network_provider == "initscripts"): False 16142 1727204135.72129: when evaluation is False, skipping this task 16142 1727204135.72136: _execute() done 16142 1727204135.72144: dumping result to json 16142 1727204135.72153: done dumping result, returning 16142 1727204135.72168: done running TaskExecutor() for managed-node2/TASK: Get the port1 device state [0affcd87-79f5-fddd-f6c7-0000000000c4] 16142 1727204135.72181: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000c4 skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 16142 1727204135.72337: no more pending results, returning what we have 16142 1727204135.72342: results queue empty 16142 1727204135.72343: checking for any_errors_fatal 16142 1727204135.72349: done checking for any_errors_fatal 16142 1727204135.72350: checking for max_fail_percentage 16142 1727204135.72353: done checking for max_fail_percentage 16142 1727204135.72354: checking to see if all hosts have failed and the running result is not ok 16142 1727204135.72355: done checking to see if all hosts have failed 16142 1727204135.72355: getting the remaining hosts for this loop 16142 1727204135.72357: done getting the remaining hosts for this loop 16142 1727204135.72361: getting the next task for host managed-node2 16142 1727204135.72369: done getting next task for host managed-node2 16142 1727204135.72371: ^ task is: TASK: Get the port2 device state 16142 1727204135.72374: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204135.72378: getting variables 16142 1727204135.72380: in VariableManager get_vars() 16142 1727204135.72439: Calling all_inventory to load vars for managed-node2 16142 1727204135.72443: Calling groups_inventory to load vars for managed-node2 16142 1727204135.72445: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204135.72459: Calling all_plugins_play to load vars for managed-node2 16142 1727204135.72461: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204135.72466: Calling groups_plugins_play to load vars for managed-node2 16142 1727204135.73509: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000c4 16142 1727204135.73512: WORKER PROCESS EXITING 16142 1727204135.74211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204135.76057: done with get_vars() 16142 1727204135.76084: done getting variables 16142 1727204135.76148: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the port2 device state] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:139 Tuesday 24 September 2024 14:55:35 -0400 (0:00:00.055) 0:00:34.938 ***** 16142 1727204135.76180: entering _queue_task() for managed-node2/command 16142 1727204135.76529: worker is 1 (out of 1 available) 16142 1727204135.76542: exiting _queue_task() for managed-node2/command 16142 1727204135.76555: done queuing things up, now waiting for results queue to drain 16142 1727204135.76556: waiting for pending results... 
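Note: the skips above are driven by the two conditionals the log evaluates, ansible_distribution_major_version != '6' (True) and network_provider == "initscripts" (False), so the device-state commands only run under the initscripts provider. A task gated this way would look roughly like the following; the command and the interface variable are assumptions, since the log never shows the skipped module arguments:

    # hypothetical shape of the skipped device-state task
    - name: Get the port1 device state
      command: ip link show {{ port1_interface }}   # assumed command and variable name
      register: port1_device_state
      when:
        - ansible_distribution_major_version != '6'
        - network_provider == "initscripts"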
16142 1727204135.76848: running TaskExecutor() for managed-node2/TASK: Get the port2 device state 16142 1727204135.76917: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000c5 16142 1727204135.76928: variable 'ansible_search_path' from source: unknown 16142 1727204135.76961: calling self._execute() 16142 1727204135.77045: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204135.77050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204135.77059: variable 'omit' from source: magic vars 16142 1727204135.77347: variable 'ansible_distribution_major_version' from source: facts 16142 1727204135.77357: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204135.77439: variable 'network_provider' from source: set_fact 16142 1727204135.77445: Evaluated conditional (network_provider == "initscripts"): False 16142 1727204135.77448: when evaluation is False, skipping this task 16142 1727204135.77450: _execute() done 16142 1727204135.77453: dumping result to json 16142 1727204135.77457: done dumping result, returning 16142 1727204135.77463: done running TaskExecutor() for managed-node2/TASK: Get the port2 device state [0affcd87-79f5-fddd-f6c7-0000000000c5] 16142 1727204135.77471: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000c5 16142 1727204135.77570: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000c5 16142 1727204135.77580: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 16142 1727204135.77636: no more pending results, returning what we have 16142 1727204135.77641: results queue empty 16142 1727204135.77642: checking for any_errors_fatal 16142 1727204135.77650: done checking for any_errors_fatal 16142 1727204135.77651: checking for max_fail_percentage 16142 1727204135.77653: done checking for max_fail_percentage 16142 1727204135.77654: checking to see if all hosts have failed and the running result is not ok 16142 1727204135.77655: done checking to see if all hosts have failed 16142 1727204135.77656: getting the remaining hosts for this loop 16142 1727204135.77658: done getting the remaining hosts for this loop 16142 1727204135.77662: getting the next task for host managed-node2 16142 1727204135.77670: done getting next task for host managed-node2 16142 1727204135.77673: ^ task is: TASK: Assert that the port1 device is in DOWN state 16142 1727204135.77675: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204135.77679: getting variables 16142 1727204135.77681: in VariableManager get_vars() 16142 1727204135.77727: Calling all_inventory to load vars for managed-node2 16142 1727204135.77730: Calling groups_inventory to load vars for managed-node2 16142 1727204135.77732: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204135.77744: Calling all_plugins_play to load vars for managed-node2 16142 1727204135.77746: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204135.77748: Calling groups_plugins_play to load vars for managed-node2 16142 1727204135.78676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204135.80137: done with get_vars() 16142 1727204135.80160: done getting variables 16142 1727204135.80209: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the port1 device is in DOWN state] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:146 Tuesday 24 September 2024 14:55:35 -0400 (0:00:00.040) 0:00:34.979 ***** 16142 1727204135.80232: entering _queue_task() for managed-node2/assert 16142 1727204135.80482: worker is 1 (out of 1 available) 16142 1727204135.80496: exiting _queue_task() for managed-node2/assert 16142 1727204135.80509: done queuing things up, now waiting for results queue to drain 16142 1727204135.80510: waiting for pending results... 16142 1727204135.80692: running TaskExecutor() for managed-node2/TASK: Assert that the port1 device is in DOWN state 16142 1727204135.80759: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000c6 16142 1727204135.80773: variable 'ansible_search_path' from source: unknown 16142 1727204135.80805: calling self._execute() 16142 1727204135.80893: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204135.80897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204135.80906: variable 'omit' from source: magic vars 16142 1727204135.81199: variable 'ansible_distribution_major_version' from source: facts 16142 1727204135.81210: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204135.81297: variable 'network_provider' from source: set_fact 16142 1727204135.81301: Evaluated conditional (network_provider == "initscripts"): False 16142 1727204135.81304: when evaluation is False, skipping this task 16142 1727204135.81308: _execute() done 16142 1727204135.81312: dumping result to json 16142 1727204135.81316: done dumping result, returning 16142 1727204135.81322: done running TaskExecutor() for managed-node2/TASK: Assert that the port1 device is in DOWN state [0affcd87-79f5-fddd-f6c7-0000000000c6] 16142 1727204135.81329: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000c6 16142 1727204135.81422: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000c6 16142 1727204135.81425: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 16142 1727204135.81479: no more pending results, returning what we have 16142 1727204135.81483: results 
queue empty 16142 1727204135.81483: checking for any_errors_fatal 16142 1727204135.81488: done checking for any_errors_fatal 16142 1727204135.81488: checking for max_fail_percentage 16142 1727204135.81491: done checking for max_fail_percentage 16142 1727204135.81492: checking to see if all hosts have failed and the running result is not ok 16142 1727204135.81493: done checking to see if all hosts have failed 16142 1727204135.81493: getting the remaining hosts for this loop 16142 1727204135.81495: done getting the remaining hosts for this loop 16142 1727204135.81499: getting the next task for host managed-node2 16142 1727204135.81505: done getting next task for host managed-node2 16142 1727204135.81507: ^ task is: TASK: Assert that the port2 device is in DOWN state 16142 1727204135.81510: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204135.81514: getting variables 16142 1727204135.81516: in VariableManager get_vars() 16142 1727204135.81606: Calling all_inventory to load vars for managed-node2 16142 1727204135.81609: Calling groups_inventory to load vars for managed-node2 16142 1727204135.81611: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204135.81622: Calling all_plugins_play to load vars for managed-node2 16142 1727204135.81624: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204135.81627: Calling groups_plugins_play to load vars for managed-node2 16142 1727204135.83341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204135.85010: done with get_vars() 16142 1727204135.85044: done getting variables 16142 1727204135.85110: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the port2 device is in DOWN state] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:153 Tuesday 24 September 2024 14:55:35 -0400 (0:00:00.049) 0:00:35.028 ***** 16142 1727204135.85146: entering _queue_task() for managed-node2/assert 16142 1727204135.85500: worker is 1 (out of 1 available) 16142 1727204135.85514: exiting _queue_task() for managed-node2/assert 16142 1727204135.85526: done queuing things up, now waiting for results queue to drain 16142 1727204135.85527: waiting for pending results... 
16142 1727204135.85853: running TaskExecutor() for managed-node2/TASK: Assert that the port2 device is in DOWN state 16142 1727204135.85945: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000c7 16142 1727204135.85959: variable 'ansible_search_path' from source: unknown 16142 1727204135.86003: calling self._execute() 16142 1727204135.86113: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204135.86118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204135.86137: variable 'omit' from source: magic vars 16142 1727204135.86531: variable 'ansible_distribution_major_version' from source: facts 16142 1727204135.86541: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204135.86661: variable 'network_provider' from source: set_fact 16142 1727204135.86671: Evaluated conditional (network_provider == "initscripts"): False 16142 1727204135.86676: when evaluation is False, skipping this task 16142 1727204135.86679: _execute() done 16142 1727204135.86682: dumping result to json 16142 1727204135.86687: done dumping result, returning 16142 1727204135.86694: done running TaskExecutor() for managed-node2/TASK: Assert that the port2 device is in DOWN state [0affcd87-79f5-fddd-f6c7-0000000000c7] 16142 1727204135.86701: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000c7 16142 1727204135.86806: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000c7 16142 1727204135.86809: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 16142 1727204135.86885: no more pending results, returning what we have 16142 1727204135.86890: results queue empty 16142 1727204135.86891: checking for any_errors_fatal 16142 1727204135.86899: done checking for any_errors_fatal 16142 1727204135.86900: checking for max_fail_percentage 16142 1727204135.86902: done checking for max_fail_percentage 16142 1727204135.86904: checking to see if all hosts have failed and the running result is not ok 16142 1727204135.86905: done checking to see if all hosts have failed 16142 1727204135.86905: getting the remaining hosts for this loop 16142 1727204135.86907: done getting the remaining hosts for this loop 16142 1727204135.86911: getting the next task for host managed-node2 16142 1727204135.86920: done getting next task for host managed-node2 16142 1727204135.86927: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 16142 1727204135.86930: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204135.86954: getting variables 16142 1727204135.86957: in VariableManager get_vars() 16142 1727204135.87022: Calling all_inventory to load vars for managed-node2 16142 1727204135.87026: Calling groups_inventory to load vars for managed-node2 16142 1727204135.87028: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204135.87040: Calling all_plugins_play to load vars for managed-node2 16142 1727204135.87043: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204135.87046: Calling groups_plugins_play to load vars for managed-node2 16142 1727204135.88647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204135.90379: done with get_vars() 16142 1727204135.90411: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:55:35 -0400 (0:00:00.053) 0:00:35.081 ***** 16142 1727204135.90511: entering _queue_task() for managed-node2/include_tasks 16142 1727204135.90856: worker is 1 (out of 1 available) 16142 1727204135.90870: exiting _queue_task() for managed-node2/include_tasks 16142 1727204135.90882: done queuing things up, now waiting for results queue to drain 16142 1727204135.90883: waiting for pending results... 16142 1727204135.91183: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 16142 1727204135.91312: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000cf 16142 1727204135.91329: variable 'ansible_search_path' from source: unknown 16142 1727204135.91336: variable 'ansible_search_path' from source: unknown 16142 1727204135.91371: calling self._execute() 16142 1727204135.91474: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204135.91480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204135.91489: variable 'omit' from source: magic vars 16142 1727204135.91900: variable 'ansible_distribution_major_version' from source: facts 16142 1727204135.91913: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204135.91920: _execute() done 16142 1727204135.91923: dumping result to json 16142 1727204135.91927: done dumping result, returning 16142 1727204135.91940: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-fddd-f6c7-0000000000cf] 16142 1727204135.91947: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000cf 16142 1727204135.92101: no more pending results, returning what we have 16142 1727204135.92107: in VariableManager get_vars() 16142 1727204135.92176: Calling all_inventory to load vars for managed-node2 16142 1727204135.92180: Calling groups_inventory to load vars for managed-node2 16142 1727204135.92182: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204135.92196: Calling all_plugins_play to load vars for managed-node2 16142 1727204135.92199: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204135.92203: Calling groups_plugins_play to load vars for managed-node2 16142 1727204135.92803: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000cf 16142 1727204135.92807: WORKER PROCESS EXITING 16142 1727204135.94005: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204135.95686: done with get_vars() 16142 1727204135.95709: variable 'ansible_search_path' from source: unknown 16142 1727204135.95710: variable 'ansible_search_path' from source: unknown 16142 1727204135.95753: we have included files to process 16142 1727204135.95754: generating all_blocks data 16142 1727204135.95756: done generating all_blocks data 16142 1727204135.95761: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 16142 1727204135.95762: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 16142 1727204135.95766: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 16142 1727204135.96606: done processing included file 16142 1727204135.96608: iterating over new_blocks loaded from include file 16142 1727204135.96610: in VariableManager get_vars() 16142 1727204135.96645: done with get_vars() 16142 1727204135.96646: filtering new block on tags 16142 1727204135.96666: done filtering new block on tags 16142 1727204135.96669: in VariableManager get_vars() 16142 1727204135.96818: done with get_vars() 16142 1727204135.96820: filtering new block on tags 16142 1727204135.96841: done filtering new block on tags 16142 1727204135.96844: in VariableManager get_vars() 16142 1727204135.96877: done with get_vars() 16142 1727204135.96879: filtering new block on tags 16142 1727204135.96897: done filtering new block on tags 16142 1727204135.96899: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 16142 1727204135.96904: extending task lists for all hosts with included blocks 16142 1727204135.98824: done extending task lists 16142 1727204135.98825: done processing included files 16142 1727204135.98826: results queue empty 16142 1727204135.98827: checking for any_errors_fatal 16142 1727204135.98830: done checking for any_errors_fatal 16142 1727204135.98831: checking for max_fail_percentage 16142 1727204135.98833: done checking for max_fail_percentage 16142 1727204135.98833: checking to see if all hosts have failed and the running result is not ok 16142 1727204135.98834: done checking to see if all hosts have failed 16142 1727204135.98835: getting the remaining hosts for this loop 16142 1727204135.98836: done getting the remaining hosts for this loop 16142 1727204135.98839: getting the next task for host managed-node2 16142 1727204135.98843: done getting next task for host managed-node2 16142 1727204135.98846: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 16142 1727204135.98849: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204135.98862: getting variables 16142 1727204135.98863: in VariableManager get_vars() 16142 1727204135.99005: Calling all_inventory to load vars for managed-node2 16142 1727204135.99008: Calling groups_inventory to load vars for managed-node2 16142 1727204135.99011: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204135.99017: Calling all_plugins_play to load vars for managed-node2 16142 1727204135.99020: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204135.99023: Calling groups_plugins_play to load vars for managed-node2 16142 1727204136.01590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204136.04770: done with get_vars() 16142 1727204136.04795: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:55:36 -0400 (0:00:00.143) 0:00:35.225 ***** 16142 1727204136.04884: entering _queue_task() for managed-node2/setup 16142 1727204136.05230: worker is 1 (out of 1 available) 16142 1727204136.05243: exiting _queue_task() for managed-node2/setup 16142 1727204136.05255: done queuing things up, now waiting for results queue to drain 16142 1727204136.05256: waiting for pending results... 16142 1727204136.05569: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 16142 1727204136.05713: in run() - task 0affcd87-79f5-fddd-f6c7-000000000796 16142 1727204136.05728: variable 'ansible_search_path' from source: unknown 16142 1727204136.05737: variable 'ansible_search_path' from source: unknown 16142 1727204136.05799: calling self._execute() 16142 1727204136.05997: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204136.06029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204136.06033: variable 'omit' from source: magic vars 16142 1727204136.07271: variable 'ansible_distribution_major_version' from source: facts 16142 1727204136.07276: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204136.07515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204136.10217: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204136.10311: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204136.10353: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204136.10395: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204136.10425: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204136.10517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
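Note: the include processing logged above (roles/network/tasks/main.yml:4 loading set_facts.yml, then "extending task lists for all hosts with included blocks") corresponds to an include of roughly this form in the role's main.yml. This is a sketch consistent with the log, not the verbatim role source:

    - name: Ensure ansible_facts used by role
      include_tasks: set_facts.yml

The included file contributes the fact-gathering, ostree-detection, and service-check tasks that follow in this log.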
16142 1727204136.10545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204136.10572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204136.10619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204136.10632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204136.10686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204136.10716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204136.10740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204136.10781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204136.10795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204136.10971: variable '__network_required_facts' from source: role '' defaults 16142 1727204136.10980: variable 'ansible_facts' from source: unknown 16142 1727204136.11983: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 16142 1727204136.11987: when evaluation is False, skipping this task 16142 1727204136.11990: _execute() done 16142 1727204136.11993: dumping result to json 16142 1727204136.11995: done dumping result, returning 16142 1727204136.12006: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-fddd-f6c7-000000000796] 16142 1727204136.12011: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000796 16142 1727204136.12111: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000796 16142 1727204136.12114: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204136.12199: no more pending results, returning what we have 16142 1727204136.12204: results queue empty 16142 1727204136.12205: checking for any_errors_fatal 16142 1727204136.12207: done checking for any_errors_fatal 16142 1727204136.12208: checking for max_fail_percentage 16142 1727204136.12211: done checking for max_fail_percentage 16142 1727204136.12212: checking to see if all hosts have failed and the running 
result is not ok 16142 1727204136.12213: done checking to see if all hosts have failed 16142 1727204136.12214: getting the remaining hosts for this loop 16142 1727204136.12216: done getting the remaining hosts for this loop 16142 1727204136.12221: getting the next task for host managed-node2 16142 1727204136.12231: done getting next task for host managed-node2 16142 1727204136.12236: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 16142 1727204136.12239: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204136.12267: getting variables 16142 1727204136.12269: in VariableManager get_vars() 16142 1727204136.12327: Calling all_inventory to load vars for managed-node2 16142 1727204136.12330: Calling groups_inventory to load vars for managed-node2 16142 1727204136.12332: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204136.12343: Calling all_plugins_play to load vars for managed-node2 16142 1727204136.12346: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204136.12349: Calling groups_plugins_play to load vars for managed-node2 16142 1727204136.14131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204136.15067: done with get_vars() 16142 1727204136.15089: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:55:36 -0400 (0:00:00.102) 0:00:35.328 ***** 16142 1727204136.15170: entering _queue_task() for managed-node2/stat 16142 1727204136.15408: worker is 1 (out of 1 available) 16142 1727204136.15424: exiting _queue_task() for managed-node2/stat 16142 1727204136.15437: done queuing things up, now waiting for results queue to drain 16142 1727204136.15438: waiting for pending results... 
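Note: the setup task skipped above is gated on __network_required_facts | difference(ansible_facts.keys() | list) | length > 0, i.e. facts are only re-gathered when something the role needs is missing from what is already cached. A hedged sketch of that pattern follows; the gather_subset value is an assumption, since the module arguments are hidden by no_log in this run:

    - name: Ensure ansible_facts used by role are present
      setup:
        gather_subset: min   # assumption; the actual subset is not visible in this log
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

Here the condition was False because all required fact keys were already present, so the task was skipped without contacting the host.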
16142 1727204136.15632: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 16142 1727204136.15762: in run() - task 0affcd87-79f5-fddd-f6c7-000000000798 16142 1727204136.15784: variable 'ansible_search_path' from source: unknown 16142 1727204136.15804: variable 'ansible_search_path' from source: unknown 16142 1727204136.15945: calling self._execute() 16142 1727204136.15948: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204136.16097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204136.16101: variable 'omit' from source: magic vars 16142 1727204136.16358: variable 'ansible_distribution_major_version' from source: facts 16142 1727204136.16374: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204136.16540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204136.16820: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204136.16871: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204136.16906: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204136.16944: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204136.17030: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204136.17052: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204136.17081: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204136.17107: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204136.17213: variable '__network_is_ostree' from source: set_fact 16142 1727204136.17224: Evaluated conditional (not __network_is_ostree is defined): False 16142 1727204136.17237: when evaluation is False, skipping this task 16142 1727204136.17244: _execute() done 16142 1727204136.17247: dumping result to json 16142 1727204136.17259: done dumping result, returning 16142 1727204136.17283: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-fddd-f6c7-000000000798] 16142 1727204136.17300: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000798 16142 1727204136.17401: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000798 16142 1727204136.17405: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 16142 1727204136.17456: no more pending results, returning what we have 16142 1727204136.17460: results queue empty 16142 1727204136.17461: checking for any_errors_fatal 16142 1727204136.17470: done checking for any_errors_fatal 16142 1727204136.17470: checking for 
max_fail_percentage 16142 1727204136.17472: done checking for max_fail_percentage 16142 1727204136.17473: checking to see if all hosts have failed and the running result is not ok 16142 1727204136.17474: done checking to see if all hosts have failed 16142 1727204136.17475: getting the remaining hosts for this loop 16142 1727204136.17477: done getting the remaining hosts for this loop 16142 1727204136.17481: getting the next task for host managed-node2 16142 1727204136.17487: done getting next task for host managed-node2 16142 1727204136.17491: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 16142 1727204136.17495: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204136.17519: getting variables 16142 1727204136.17521: in VariableManager get_vars() 16142 1727204136.17577: Calling all_inventory to load vars for managed-node2 16142 1727204136.17580: Calling groups_inventory to load vars for managed-node2 16142 1727204136.17582: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204136.17591: Calling all_plugins_play to load vars for managed-node2 16142 1727204136.17593: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204136.17596: Calling groups_plugins_play to load vars for managed-node2 16142 1727204136.18565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204136.19895: done with get_vars() 16142 1727204136.19928: done getting variables 16142 1727204136.19995: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:55:36 -0400 (0:00:00.048) 0:00:35.377 ***** 16142 1727204136.20042: entering _queue_task() for managed-node2/set_fact 16142 1727204136.20419: worker is 1 (out of 1 available) 16142 1727204136.20433: exiting _queue_task() for managed-node2/set_fact 16142 1727204136.20448: done queuing things up, now waiting for results queue to drain 16142 1727204136.20449: waiting for pending results... 
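Note: the two skipped tasks above, "Check if system is ostree" (stat) and "Set flag to indicate system is ostree" (set_fact), are both guarded by not __network_is_ostree is defined, so the probe only runs once per host and is skipped on later passes such as this one. A hypothetical sketch of that pair; the stat path and the intermediate register name are assumptions, as the log does not show the module arguments:

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted    # assumed path; not shown in the log
      register: __ostree_booted_stat
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists | default(false) }}"
      when: not __network_is_ostree is defined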
16142 1727204136.20882: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 16142 1727204136.21068: in run() - task 0affcd87-79f5-fddd-f6c7-000000000799 16142 1727204136.21078: variable 'ansible_search_path' from source: unknown 16142 1727204136.21082: variable 'ansible_search_path' from source: unknown 16142 1727204136.21112: calling self._execute() 16142 1727204136.21187: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204136.21191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204136.21199: variable 'omit' from source: magic vars 16142 1727204136.21478: variable 'ansible_distribution_major_version' from source: facts 16142 1727204136.21489: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204136.21610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204136.21805: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204136.21840: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204136.21865: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204136.21893: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204136.21957: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204136.21977: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204136.22002: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204136.22018: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204136.22083: variable '__network_is_ostree' from source: set_fact 16142 1727204136.22088: Evaluated conditional (not __network_is_ostree is defined): False 16142 1727204136.22093: when evaluation is False, skipping this task 16142 1727204136.22096: _execute() done 16142 1727204136.22099: dumping result to json 16142 1727204136.22101: done dumping result, returning 16142 1727204136.22112: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-fddd-f6c7-000000000799] 16142 1727204136.22115: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000799 16142 1727204136.22199: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000799 16142 1727204136.22202: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 16142 1727204136.22259: no more pending results, returning what we have 16142 1727204136.22263: results queue empty 16142 1727204136.22265: checking for any_errors_fatal 16142 1727204136.22271: done checking for any_errors_fatal 16142 
1727204136.22272: checking for max_fail_percentage 16142 1727204136.22274: done checking for max_fail_percentage 16142 1727204136.22275: checking to see if all hosts have failed and the running result is not ok 16142 1727204136.22276: done checking to see if all hosts have failed 16142 1727204136.22276: getting the remaining hosts for this loop 16142 1727204136.22278: done getting the remaining hosts for this loop 16142 1727204136.22281: getting the next task for host managed-node2 16142 1727204136.22292: done getting next task for host managed-node2 16142 1727204136.22296: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 16142 1727204136.22300: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204136.22321: getting variables 16142 1727204136.22323: in VariableManager get_vars() 16142 1727204136.22375: Calling all_inventory to load vars for managed-node2 16142 1727204136.22378: Calling groups_inventory to load vars for managed-node2 16142 1727204136.22380: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204136.22389: Calling all_plugins_play to load vars for managed-node2 16142 1727204136.22391: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204136.22393: Calling groups_plugins_play to load vars for managed-node2 16142 1727204136.23728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204136.25337: done with get_vars() 16142 1727204136.25360: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:55:36 -0400 (0:00:00.053) 0:00:35.431 ***** 16142 1727204136.25440: entering _queue_task() for managed-node2/service_facts 16142 1727204136.25688: worker is 1 (out of 1 available) 16142 1727204136.25701: exiting _queue_task() for managed-node2/service_facts 16142 1727204136.25714: done queuing things up, now waiting for results queue to drain 16142 1727204136.25716: waiting for pending results... 
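[Editor's note] The task queued here runs the ansible.builtin.service_facts module with no arguments (the invocation recorded later in this log shows empty module_args) and populates ansible_facts.services with one entry per unit, each carrying name, state, status, and source. A minimal sketch of the task plus an illustrative consumer follows; the debug task is an assumption added for illustration and is not part of the role:

    - name: Check which services are running
      service_facts:

    - name: Report NetworkManager state (illustrative, not in the role)
      debug:
        msg: "NetworkManager is {{ ansible_facts.services['NetworkManager.service'].state }}"

In the module output further down in this log, NetworkManager.service is reported as running and enabled on managed-node2, which is the kind of entry later role logic can key on.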
16142 1727204136.25911: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 16142 1727204136.26026: in run() - task 0affcd87-79f5-fddd-f6c7-00000000079b 16142 1727204136.26038: variable 'ansible_search_path' from source: unknown 16142 1727204136.26042: variable 'ansible_search_path' from source: unknown 16142 1727204136.26073: calling self._execute() 16142 1727204136.26175: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204136.26181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204136.26192: variable 'omit' from source: magic vars 16142 1727204136.26467: variable 'ansible_distribution_major_version' from source: facts 16142 1727204136.26478: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204136.26485: variable 'omit' from source: magic vars 16142 1727204136.26539: variable 'omit' from source: magic vars 16142 1727204136.26566: variable 'omit' from source: magic vars 16142 1727204136.26601: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204136.26636: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204136.26899: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204136.26904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204136.26907: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204136.26910: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204136.26912: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204136.26915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204136.26917: Set connection var ansible_timeout to 10 16142 1727204136.26919: Set connection var ansible_connection to ssh 16142 1727204136.26922: Set connection var ansible_shell_type to sh 16142 1727204136.26924: Set connection var ansible_shell_executable to /bin/sh 16142 1727204136.26926: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204136.26928: Set connection var ansible_pipelining to False 16142 1727204136.26930: variable 'ansible_shell_executable' from source: unknown 16142 1727204136.26932: variable 'ansible_connection' from source: unknown 16142 1727204136.26937: variable 'ansible_module_compression' from source: unknown 16142 1727204136.26939: variable 'ansible_shell_type' from source: unknown 16142 1727204136.26941: variable 'ansible_shell_executable' from source: unknown 16142 1727204136.26943: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204136.26945: variable 'ansible_pipelining' from source: unknown 16142 1727204136.26947: variable 'ansible_timeout' from source: unknown 16142 1727204136.26949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204136.27381: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204136.27392: variable 'omit' from source: magic vars 16142 
1727204136.27401: starting attempt loop 16142 1727204136.27405: running the handler 16142 1727204136.27414: _low_level_execute_command(): starting 16142 1727204136.27421: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204136.28771: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204136.28775: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204136.28778: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204136.28780: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204136.28845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204136.28851: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204136.28897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204136.30557: stdout chunk (state=3): >>>/root <<< 16142 1727204136.30679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204136.31153: stderr chunk (state=3): >>><<< 16142 1727204136.31158: stdout chunk (state=3): >>><<< 16142 1727204136.31187: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204136.31201: _low_level_execute_command(): starting 16142 1727204136.31208: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204136.3118742-18761-61929101402572 `" && echo ansible-tmp-1727204136.3118742-18761-61929101402572="` 
echo /root/.ansible/tmp/ansible-tmp-1727204136.3118742-18761-61929101402572 `" ) && sleep 0' 16142 1727204136.31844: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204136.31852: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204136.31862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204136.31878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204136.31917: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204136.31924: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204136.31939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204136.31947: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204136.31957: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204136.31966: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204136.31973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204136.31983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204136.31995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204136.32001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204136.32008: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204136.32018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204136.32091: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204136.32109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204136.32121: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204136.32194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204136.34080: stdout chunk (state=3): >>>ansible-tmp-1727204136.3118742-18761-61929101402572=/root/.ansible/tmp/ansible-tmp-1727204136.3118742-18761-61929101402572 <<< 16142 1727204136.34191: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204136.34278: stderr chunk (state=3): >>><<< 16142 1727204136.34284: stdout chunk (state=3): >>><<< 16142 1727204136.34307: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204136.3118742-18761-61929101402572=/root/.ansible/tmp/ansible-tmp-1727204136.3118742-18761-61929101402572 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204136.34358: variable 'ansible_module_compression' from source: unknown 16142 1727204136.34405: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 16142 1727204136.34447: variable 'ansible_facts' from source: unknown 16142 1727204136.34525: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204136.3118742-18761-61929101402572/AnsiballZ_service_facts.py 16142 1727204136.34680: Sending initial data 16142 1727204136.34683: Sent initial data (161 bytes) 16142 1727204136.35655: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204136.35666: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204136.35677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204136.35691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204136.35730: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204136.35739: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204136.35747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204136.35761: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204136.35777: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204136.35784: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204136.35792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204136.35801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204136.35812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204136.35820: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204136.35826: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204136.35838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204136.35908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204136.35927: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204136.35939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204136.36000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204136.37771: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204136.37802: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204136.37841: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpe2q70w9c /root/.ansible/tmp/ansible-tmp-1727204136.3118742-18761-61929101402572/AnsiballZ_service_facts.py <<< 16142 1727204136.37868: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204136.38996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204136.39088: stderr chunk (state=3): >>><<< 16142 1727204136.39092: stdout chunk (state=3): >>><<< 16142 1727204136.39113: done transferring module to remote 16142 1727204136.39127: _low_level_execute_command(): starting 16142 1727204136.39133: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204136.3118742-18761-61929101402572/ /root/.ansible/tmp/ansible-tmp-1727204136.3118742-18761-61929101402572/AnsiballZ_service_facts.py && sleep 0' 16142 1727204136.39851: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204136.39859: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204136.39879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204136.39892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204136.39933: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204136.39942: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204136.39952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204136.39967: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204136.39980: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204136.39987: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204136.39995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204136.40004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204136.40016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204136.40023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204136.40030: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204136.40042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204136.40123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204136.40143: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204136.40156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204136.40227: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204136.42053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204136.42058: stdout chunk (state=3): >>><<< 16142 1727204136.42066: stderr chunk (state=3): >>><<< 16142 1727204136.42088: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204136.42092: _low_level_execute_command(): starting 16142 1727204136.42095: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204136.3118742-18761-61929101402572/AnsiballZ_service_facts.py && sleep 0' 16142 1727204136.43446: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204136.43450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204136.43525: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204136.43529: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204136.43619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204136.43625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204136.43702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204136.43781: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204136.43786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204136.43890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204137.74807: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 16142 1727204137.74830: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": 
"systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "syst<<< 16142 1727204137.74847: stdout chunk (state=3): >>>emd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.s<<< 16142 1727204137.74865: stdout chunk (state=3): >>>ervice", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hiber<<< 16142 1727204137.74872: stdout chunk (state=3): >>>nate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 16142 1727204137.76147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 16142 1727204137.76234: stderr chunk (state=3): >>><<< 16142 1727204137.76237: stdout chunk (state=3): >>><<< 16142 1727204137.76275: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": 
"plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": 
"systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": 
"unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
16142 1727204137.77853: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204136.3118742-18761-61929101402572/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204137.77876: _low_level_execute_command(): starting 16142 1727204137.77910: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204136.3118742-18761-61929101402572/ > /dev/null 2>&1 && sleep 0' 16142 1727204137.79687: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204137.79704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204137.79725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204137.79747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204137.79800: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204137.79877: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204137.79892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204137.79909: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204137.79921: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204137.79931: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204137.79942: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204137.79954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204137.79976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204137.79988: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204137.79998: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204137.80010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204137.80086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204137.80209: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204137.80225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204137.80424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204137.82187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204137.82226: stderr chunk (state=3): >>><<< 16142 1727204137.82229: stdout chunk (state=3): >>><<< 16142 1727204137.82474: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204137.82477: handler run complete 16142 1727204137.82480: variable 'ansible_facts' from source: unknown 16142 1727204137.82608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204137.83167: variable 'ansible_facts' from source: unknown 16142 1727204137.83454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204137.83757: attempt loop complete, returning result 16142 1727204137.83892: _execute() done 16142 1727204137.83900: dumping result to json 16142 1727204137.83959: done dumping result, returning 16142 1727204137.84005: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-fddd-f6c7-00000000079b] 16142 1727204137.84017: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000079b ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204137.85651: no more pending results, returning what we have 16142 1727204137.85655: results queue empty 16142 1727204137.85656: checking for any_errors_fatal 16142 1727204137.85660: done checking for any_errors_fatal 16142 1727204137.85661: checking for max_fail_percentage 16142 1727204137.85663: done checking for max_fail_percentage 16142 1727204137.85669: checking to see if all hosts have failed and the running result is not ok 16142 1727204137.85670: done checking to see if all hosts have failed 16142 1727204137.85670: getting the remaining hosts for this loop 16142 1727204137.85672: done getting the remaining hosts for this loop 16142 1727204137.85676: getting the next task for host managed-node2 16142 1727204137.85684: done getting next task for host managed-node2 16142 1727204137.85687: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 16142 1727204137.85691: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204137.85705: getting variables 16142 1727204137.85707: in VariableManager get_vars() 16142 1727204137.85755: Calling all_inventory to load vars for managed-node2 16142 1727204137.85758: Calling groups_inventory to load vars for managed-node2 16142 1727204137.85760: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204137.85772: Calling all_plugins_play to load vars for managed-node2 16142 1727204137.85777: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204137.85780: Calling groups_plugins_play to load vars for managed-node2 16142 1727204137.86964: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000079b 16142 1727204137.86979: WORKER PROCESS EXITING 16142 1727204137.88427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204137.93234: done with get_vars() 16142 1727204137.94274: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:55:37 -0400 (0:00:01.689) 0:00:37.120 ***** 16142 1727204137.94379: entering _queue_task() for managed-node2/package_facts 16142 1727204137.94725: worker is 1 (out of 1 available) 16142 1727204137.94738: exiting _queue_task() for managed-node2/package_facts 16142 1727204137.94750: done queuing things up, now waiting for results queue to drain 16142 1727204137.94752: waiting for pending results... 
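
(Note: the task result above is shown as "censored" because the service-facts task ran with no_log: true, and the engine immediately queues the role's next fact-gathering task, package_facts. A hedged sketch of how such tasks are commonly written -- illustrative only, not the actual contents of the collection's set_facts.yml -- might look like:

    - name: Check which services are running
      ansible.builtin.service_facts:
      no_log: true          # output hidden in results, as seen above

    - name: Check which packages are installed
      ansible.builtin.package_facts:

    - name: Read a gathered package fact (hypothetical usage)
      ansible.builtin.debug:
        msg: "glibc version: {{ ansible_facts.packages['glibc'][0].version }}"
      when: "'glibc' in ansible_facts.packages"

package_facts returns ansible_facts.packages as a dict of lists keyed by package name, which is the structure that appears in the module output further below.)
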
16142 1727204137.95622: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 16142 1727204137.95774: in run() - task 0affcd87-79f5-fddd-f6c7-00000000079c 16142 1727204137.95789: variable 'ansible_search_path' from source: unknown 16142 1727204137.95793: variable 'ansible_search_path' from source: unknown 16142 1727204137.95827: calling self._execute() 16142 1727204137.96441: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204137.96448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204137.96459: variable 'omit' from source: magic vars 16142 1727204137.96826: variable 'ansible_distribution_major_version' from source: facts 16142 1727204137.96841: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204137.96848: variable 'omit' from source: magic vars 16142 1727204137.97433: variable 'omit' from source: magic vars 16142 1727204137.97471: variable 'omit' from source: magic vars 16142 1727204137.97512: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204137.97550: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204137.97574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204137.97593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204137.97604: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204137.97634: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204137.97640: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204137.97643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204137.97747: Set connection var ansible_timeout to 10 16142 1727204137.97751: Set connection var ansible_connection to ssh 16142 1727204137.97753: Set connection var ansible_shell_type to sh 16142 1727204137.97760: Set connection var ansible_shell_executable to /bin/sh 16142 1727204137.97767: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204137.97775: Set connection var ansible_pipelining to False 16142 1727204137.97798: variable 'ansible_shell_executable' from source: unknown 16142 1727204137.97802: variable 'ansible_connection' from source: unknown 16142 1727204137.97805: variable 'ansible_module_compression' from source: unknown 16142 1727204137.97807: variable 'ansible_shell_type' from source: unknown 16142 1727204137.97810: variable 'ansible_shell_executable' from source: unknown 16142 1727204137.97812: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204137.97814: variable 'ansible_pipelining' from source: unknown 16142 1727204137.97816: variable 'ansible_timeout' from source: unknown 16142 1727204137.97818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204137.98526: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204137.98536: variable 'omit' from source: magic vars 16142 
1727204137.98544: starting attempt loop 16142 1727204137.98547: running the handler 16142 1727204137.98561: _low_level_execute_command(): starting 16142 1727204137.98571: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204138.00546: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204138.00560: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204138.00577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204138.00595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204138.00635: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204138.00647: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204138.00658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204138.00673: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204138.00683: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204138.00689: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204138.00697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204138.00707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204138.00721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204138.00728: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204138.00737: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204138.00750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204138.00826: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204138.00850: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204138.00862: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204138.00939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204138.02573: stdout chunk (state=3): >>>/root <<< 16142 1727204138.02750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204138.02754: stdout chunk (state=3): >>><<< 16142 1727204138.02765: stderr chunk (state=3): >>><<< 16142 1727204138.02787: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204138.02802: _low_level_execute_command(): starting 16142 1727204138.02808: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204138.0278707-18954-149572923326258 `" && echo ansible-tmp-1727204138.0278707-18954-149572923326258="` echo /root/.ansible/tmp/ansible-tmp-1727204138.0278707-18954-149572923326258 `" ) && sleep 0' 16142 1727204138.04157: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204138.04780: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204138.04791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204138.04804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204138.04848: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204138.04854: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204138.04866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204138.04880: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204138.04886: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204138.04893: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204138.04900: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204138.04909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204138.04920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204138.04927: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204138.04933: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204138.04946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204138.05021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204138.05042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204138.05054: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204138.05127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204138.06994: stdout chunk (state=3): >>>ansible-tmp-1727204138.0278707-18954-149572923326258=/root/.ansible/tmp/ansible-tmp-1727204138.0278707-18954-149572923326258 <<< 16142 1727204138.07179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204138.07183: stdout chunk (state=3): >>><<< 16142 1727204138.07190: stderr chunk (state=3): >>><<< 16142 1727204138.07207: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204138.0278707-18954-149572923326258=/root/.ansible/tmp/ansible-tmp-1727204138.0278707-18954-149572923326258 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204138.07259: variable 'ansible_module_compression' from source: unknown 16142 1727204138.07311: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 16142 1727204138.07374: variable 'ansible_facts' from source: unknown 16142 1727204138.07572: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204138.0278707-18954-149572923326258/AnsiballZ_package_facts.py 16142 1727204138.08091: Sending initial data 16142 1727204138.08094: Sent initial data (162 bytes) 16142 1727204138.10550: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204138.10657: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204138.10662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204138.10709: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204138.10713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 16142 1727204138.10726: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204138.10730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204138.10746: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204138.10846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204138.10985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204138.11081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 
1727204138.12798: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204138.12803: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204138.12843: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpsoc23osx /root/.ansible/tmp/ansible-tmp-1727204138.0278707-18954-149572923326258/AnsiballZ_package_facts.py <<< 16142 1727204138.12879: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204138.15906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204138.16052: stderr chunk (state=3): >>><<< 16142 1727204138.16055: stdout chunk (state=3): >>><<< 16142 1727204138.16058: done transferring module to remote 16142 1727204138.16060: _low_level_execute_command(): starting 16142 1727204138.16063: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204138.0278707-18954-149572923326258/ /root/.ansible/tmp/ansible-tmp-1727204138.0278707-18954-149572923326258/AnsiballZ_package_facts.py && sleep 0' 16142 1727204138.17524: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204138.17545: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204138.17569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204138.17595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204138.17643: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204138.17699: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204138.17715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204138.17733: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204138.17808: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204138.17819: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204138.17830: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204138.17846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204138.17860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204138.17875: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204138.17885: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204138.17897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204138.18028: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204138.18047: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204138.18087: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204138.18243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204138.20182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204138.20186: stdout chunk (state=3): >>><<< 16142 1727204138.20188: stderr chunk (state=3): >>><<< 16142 1727204138.20288: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204138.20292: _low_level_execute_command(): starting 16142 1727204138.20295: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204138.0278707-18954-149572923326258/AnsiballZ_package_facts.py && sleep 0' 16142 1727204138.22667: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204138.22785: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204138.22801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204138.22820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204138.22870: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204138.22887: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204138.22902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204138.22920: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204138.22932: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204138.22947: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204138.22960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204138.22980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204138.23001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204138.23013: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204138.23023: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204138.23040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204138.23119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204138.23243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204138.23259: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204138.23456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204138.70941: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": 
"36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": 
"2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{<<< 16142 1727204138.71028: stdout chunk (state=3): >>>"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4"<<< 16142 1727204138.71073: stdout chunk (state=3): >>>, "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", 
"version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x<<< 16142 1727204138.71078: stdout chunk (state=3): >>>86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "rel<<< 16142 1727204138.71108: stdout chunk (state=3): >>>ease": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", 
"release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": 
"npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": 
[{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": 
"2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": 
"dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "<<< 16142 1727204138.71121: stdout chunk (state=3): >>>0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", 
"version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 16142 1727204138.72649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204138.72653: stdout chunk (state=3): >>><<< 16142 1727204138.72655: stderr chunk (state=3): >>><<< 16142 1727204138.72980: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": 
"json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": 
"cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": 
"selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": 
"12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": 
[{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": 
"1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": 
"4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": 
"noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 
4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": 
"libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": 
"perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
16142 1727204138.82675: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204138.0278707-18954-149572923326258/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204138.82711: _low_level_execute_command(): starting 16142 1727204138.82722: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204138.0278707-18954-149572923326258/ > /dev/null 2>&1 && sleep 0' 16142 1727204138.83475: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204138.83491: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204138.83507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204138.83530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204138.83585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204138.83598: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204138.83614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204138.83640: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204138.83654: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204138.83667: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204138.83682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204138.83696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204138.83712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204138.83724: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204138.83744: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204138.83758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204138.83842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204138.83870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204138.83889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204138.83970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204138.85905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204138.85909: stdout chunk (state=3): >>><<< 16142 1727204138.85912: stderr chunk (state=3): >>><<< 16142 1727204138.85970: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204138.85973: handler run complete 16142 1727204138.87497: variable 'ansible_facts' from source: unknown 16142 1727204138.88739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204138.93596: variable 'ansible_facts' from source: unknown 16142 1727204138.94711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204138.96443: attempt loop complete, returning result 16142 1727204138.96463: _execute() done 16142 1727204138.96577: dumping result to json 16142 1727204138.97041: done dumping result, returning 16142 1727204138.97050: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-fddd-f6c7-00000000079c] 16142 1727204138.97055: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000079c 16142 1727204139.12913: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000079c 16142 1727204139.12917: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204139.13024: no more pending results, returning what we have 16142 1727204139.13027: results queue empty 16142 1727204139.13028: checking for any_errors_fatal 16142 1727204139.13032: done checking for any_errors_fatal 16142 1727204139.13032: checking for max_fail_percentage 16142 1727204139.13033: done checking for max_fail_percentage 16142 1727204139.13037: checking to see if all hosts have failed and the running result is not ok 16142 1727204139.13038: done checking to see if all hosts have failed 16142 1727204139.13038: getting the remaining hosts for this loop 16142 1727204139.13039: done getting the remaining hosts for this loop 16142 1727204139.13042: getting the next task for host managed-node2 16142 1727204139.13047: done getting next task for host managed-node2 16142 1727204139.13049: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 16142 1727204139.13051: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204139.13062: getting variables 16142 1727204139.13063: in VariableManager get_vars() 16142 1727204139.13102: Calling all_inventory to load vars for managed-node2 16142 1727204139.13105: Calling groups_inventory to load vars for managed-node2 16142 1727204139.13107: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204139.13113: Calling all_plugins_play to load vars for managed-node2 16142 1727204139.13116: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204139.13119: Calling groups_plugins_play to load vars for managed-node2 16142 1727204139.14508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204139.17082: done with get_vars() 16142 1727204139.17122: done getting variables 16142 1727204139.17177: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:55:39 -0400 (0:00:01.228) 0:00:38.349 ***** 16142 1727204139.17217: entering _queue_task() for managed-node2/debug 16142 1727204139.17576: worker is 1 (out of 1 available) 16142 1727204139.17589: exiting _queue_task() for managed-node2/debug 16142 1727204139.17600: done queuing things up, now waiting for results queue to drain 16142 1727204139.17601: waiting for pending results... 
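The "Print network provider" task queued here comes from roles/network/tasks/main.yml:7 and, per the result a little further down ("Using network provider: nm"), is a plain debug of the network_provider fact set earlier in the role. A minimal equivalent sketch; the role's real task may be worded differently:

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"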
16142 1727204139.17920: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 16142 1727204139.18098: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000d0 16142 1727204139.18118: variable 'ansible_search_path' from source: unknown 16142 1727204139.18125: variable 'ansible_search_path' from source: unknown 16142 1727204139.18179: calling self._execute() 16142 1727204139.18296: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204139.18307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204139.18320: variable 'omit' from source: magic vars 16142 1727204139.18776: variable 'ansible_distribution_major_version' from source: facts 16142 1727204139.18798: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204139.18911: variable 'omit' from source: magic vars 16142 1727204139.18986: variable 'omit' from source: magic vars 16142 1727204139.19099: variable 'network_provider' from source: set_fact 16142 1727204139.19121: variable 'omit' from source: magic vars 16142 1727204139.19172: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204139.19217: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204139.19248: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204139.19272: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204139.19287: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204139.19327: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204139.19338: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204139.19351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204139.19576: Set connection var ansible_timeout to 10 16142 1727204139.19584: Set connection var ansible_connection to ssh 16142 1727204139.19645: Set connection var ansible_shell_type to sh 16142 1727204139.19656: Set connection var ansible_shell_executable to /bin/sh 16142 1727204139.19670: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204139.19682: Set connection var ansible_pipelining to False 16142 1727204139.19709: variable 'ansible_shell_executable' from source: unknown 16142 1727204139.19750: variable 'ansible_connection' from source: unknown 16142 1727204139.19757: variable 'ansible_module_compression' from source: unknown 16142 1727204139.19785: variable 'ansible_shell_type' from source: unknown 16142 1727204139.19792: variable 'ansible_shell_executable' from source: unknown 16142 1727204139.19799: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204139.19807: variable 'ansible_pipelining' from source: unknown 16142 1727204139.19858: variable 'ansible_timeout' from source: unknown 16142 1727204139.19868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204139.20138: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 16142 1727204139.20220: variable 'omit' from source: magic vars 16142 1727204139.20230: starting attempt loop 16142 1727204139.20239: running the handler 16142 1727204139.20332: handler run complete 16142 1727204139.20409: attempt loop complete, returning result 16142 1727204139.20417: _execute() done 16142 1727204139.20424: dumping result to json 16142 1727204139.20436: done dumping result, returning 16142 1727204139.20448: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-fddd-f6c7-0000000000d0] 16142 1727204139.20457: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000d0 ok: [managed-node2] => {} MSG: Using network provider: nm 16142 1727204139.20671: no more pending results, returning what we have 16142 1727204139.20675: results queue empty 16142 1727204139.20676: checking for any_errors_fatal 16142 1727204139.20690: done checking for any_errors_fatal 16142 1727204139.20691: checking for max_fail_percentage 16142 1727204139.20693: done checking for max_fail_percentage 16142 1727204139.20694: checking to see if all hosts have failed and the running result is not ok 16142 1727204139.20695: done checking to see if all hosts have failed 16142 1727204139.20696: getting the remaining hosts for this loop 16142 1727204139.20697: done getting the remaining hosts for this loop 16142 1727204139.20701: getting the next task for host managed-node2 16142 1727204139.20708: done getting next task for host managed-node2 16142 1727204139.20712: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 16142 1727204139.20715: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204139.20729: getting variables 16142 1727204139.20731: in VariableManager get_vars() 16142 1727204139.20794: Calling all_inventory to load vars for managed-node2 16142 1727204139.20797: Calling groups_inventory to load vars for managed-node2 16142 1727204139.20800: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204139.20810: Calling all_plugins_play to load vars for managed-node2 16142 1727204139.20814: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204139.20817: Calling groups_plugins_play to load vars for managed-node2 16142 1727204139.21932: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000d0 16142 1727204139.21939: WORKER PROCESS EXITING 16142 1727204139.23132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204139.24950: done with get_vars() 16142 1727204139.24987: done getting variables 16142 1727204139.25053: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:55:39 -0400 (0:00:00.078) 0:00:38.427 ***** 16142 1727204139.25100: entering _queue_task() for managed-node2/fail 16142 1727204139.25461: worker is 1 (out of 1 available) 16142 1727204139.25477: exiting _queue_task() for managed-node2/fail 16142 1727204139.25488: done queuing things up, now waiting for results queue to drain 16142 1727204139.25489: waiting for pending results... 
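The abort task queued here (roles/network/tasks/main.yml:11) is skipped just below with false_condition "network_state != {}": no network_state was supplied, so there is nothing to refuse. A hedged sketch of that guard pattern; only the when condition comes from the log, the message text and any extra provider check the role may add are assumptions:

    - name: Abort applying the network state configuration with the initscripts provider (sketch)
      ansible.builtin.fail:
        msg: "Applying network_state is not supported with the initscripts provider"
      when: network_state != {}   # evaluated False in this run, so the task is skipped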
16142 1727204139.25806: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 16142 1727204139.25971: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000d1 16142 1727204139.25991: variable 'ansible_search_path' from source: unknown 16142 1727204139.25998: variable 'ansible_search_path' from source: unknown 16142 1727204139.26046: calling self._execute() 16142 1727204139.26167: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204139.26183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204139.26198: variable 'omit' from source: magic vars 16142 1727204139.26739: variable 'ansible_distribution_major_version' from source: facts 16142 1727204139.26762: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204139.26908: variable 'network_state' from source: role '' defaults 16142 1727204139.26929: Evaluated conditional (network_state != {}): False 16142 1727204139.26941: when evaluation is False, skipping this task 16142 1727204139.26948: _execute() done 16142 1727204139.26956: dumping result to json 16142 1727204139.26966: done dumping result, returning 16142 1727204139.26978: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-fddd-f6c7-0000000000d1] 16142 1727204139.26989: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000d1 16142 1727204139.27124: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000d1 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204139.27180: no more pending results, returning what we have 16142 1727204139.27184: results queue empty 16142 1727204139.27185: checking for any_errors_fatal 16142 1727204139.27193: done checking for any_errors_fatal 16142 1727204139.27194: checking for max_fail_percentage 16142 1727204139.27196: done checking for max_fail_percentage 16142 1727204139.27197: checking to see if all hosts have failed and the running result is not ok 16142 1727204139.27198: done checking to see if all hosts have failed 16142 1727204139.27199: getting the remaining hosts for this loop 16142 1727204139.27201: done getting the remaining hosts for this loop 16142 1727204139.27204: getting the next task for host managed-node2 16142 1727204139.27213: done getting next task for host managed-node2 16142 1727204139.27219: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 16142 1727204139.27222: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204139.27248: getting variables 16142 1727204139.27251: in VariableManager get_vars() 16142 1727204139.27317: Calling all_inventory to load vars for managed-node2 16142 1727204139.27320: Calling groups_inventory to load vars for managed-node2 16142 1727204139.27323: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204139.27337: Calling all_plugins_play to load vars for managed-node2 16142 1727204139.27340: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204139.27343: Calling groups_plugins_play to load vars for managed-node2 16142 1727204139.28284: WORKER PROCESS EXITING 16142 1727204139.29722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204139.31553: done with get_vars() 16142 1727204139.31587: done getting variables 16142 1727204139.31659: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:55:39 -0400 (0:00:00.065) 0:00:38.493 ***** 16142 1727204139.31699: entering _queue_task() for managed-node2/fail 16142 1727204139.32086: worker is 1 (out of 1 available) 16142 1727204139.32098: exiting _queue_task() for managed-node2/fail 16142 1727204139.32111: done queuing things up, now waiting for results queue to drain 16142 1727204139.32112: waiting for pending results... 
16142 1727204139.32425: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 16142 1727204139.32595: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000d2 16142 1727204139.32614: variable 'ansible_search_path' from source: unknown 16142 1727204139.32621: variable 'ansible_search_path' from source: unknown 16142 1727204139.32669: calling self._execute() 16142 1727204139.32785: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204139.32802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204139.32818: variable 'omit' from source: magic vars 16142 1727204139.33233: variable 'ansible_distribution_major_version' from source: facts 16142 1727204139.33256: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204139.33394: variable 'network_state' from source: role '' defaults 16142 1727204139.33409: Evaluated conditional (network_state != {}): False 16142 1727204139.33420: when evaluation is False, skipping this task 16142 1727204139.33427: _execute() done 16142 1727204139.33438: dumping result to json 16142 1727204139.33456: done dumping result, returning 16142 1727204139.33485: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-fddd-f6c7-0000000000d2] 16142 1727204139.33498: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000d2 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204139.33712: no more pending results, returning what we have 16142 1727204139.33716: results queue empty 16142 1727204139.33717: checking for any_errors_fatal 16142 1727204139.33724: done checking for any_errors_fatal 16142 1727204139.33724: checking for max_fail_percentage 16142 1727204139.33727: done checking for max_fail_percentage 16142 1727204139.33728: checking to see if all hosts have failed and the running result is not ok 16142 1727204139.33729: done checking to see if all hosts have failed 16142 1727204139.33730: getting the remaining hosts for this loop 16142 1727204139.33732: done getting the remaining hosts for this loop 16142 1727204139.33738: getting the next task for host managed-node2 16142 1727204139.33746: done getting next task for host managed-node2 16142 1727204139.33750: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 16142 1727204139.33753: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204139.33778: getting variables 16142 1727204139.33780: in VariableManager get_vars() 16142 1727204139.33839: Calling all_inventory to load vars for managed-node2 16142 1727204139.33842: Calling groups_inventory to load vars for managed-node2 16142 1727204139.33844: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204139.33856: Calling all_plugins_play to load vars for managed-node2 16142 1727204139.33859: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204139.33862: Calling groups_plugins_play to load vars for managed-node2 16142 1727204139.34924: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000d2 16142 1727204139.34927: WORKER PROCESS EXITING 16142 1727204139.35988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204139.38932: done with get_vars() 16142 1727204139.38969: done getting variables 16142 1727204139.39029: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:55:39 -0400 (0:00:00.073) 0:00:38.567 ***** 16142 1727204139.39099: entering _queue_task() for managed-node2/fail 16142 1727204139.39475: worker is 1 (out of 1 available) 16142 1727204139.39487: exiting _queue_task() for managed-node2/fail 16142 1727204139.39502: done queuing things up, now waiting for results queue to drain 16142 1727204139.39503: waiting for pending results... 
16142 1727204139.39808: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 16142 1727204139.39969: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000d3 16142 1727204139.39988: variable 'ansible_search_path' from source: unknown 16142 1727204139.39996: variable 'ansible_search_path' from source: unknown 16142 1727204139.40040: calling self._execute() 16142 1727204139.40153: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204139.40174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204139.40188: variable 'omit' from source: magic vars 16142 1727204139.40592: variable 'ansible_distribution_major_version' from source: facts 16142 1727204139.40618: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204139.40806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204139.43796: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204139.43880: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204139.43924: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204139.43975: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204139.44007: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204139.44100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204139.44146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204139.44189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204139.44235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204139.44254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204139.44367: variable 'ansible_distribution_major_version' from source: facts 16142 1727204139.44396: Evaluated conditional (ansible_distribution_major_version | int > 9): False 16142 1727204139.44406: when evaluation is False, skipping this task 16142 1727204139.44414: _execute() done 16142 1727204139.44422: dumping result to json 16142 1727204139.44430: done dumping result, returning 16142 1727204139.44443: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-fddd-f6c7-0000000000d3] 16142 1727204139.44453: sending task result for task 
0affcd87-79f5-fddd-f6c7-0000000000d3 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 16142 1727204139.44611: no more pending results, returning what we have 16142 1727204139.44616: results queue empty 16142 1727204139.44617: checking for any_errors_fatal 16142 1727204139.44624: done checking for any_errors_fatal 16142 1727204139.44625: checking for max_fail_percentage 16142 1727204139.44627: done checking for max_fail_percentage 16142 1727204139.44628: checking to see if all hosts have failed and the running result is not ok 16142 1727204139.44629: done checking to see if all hosts have failed 16142 1727204139.44630: getting the remaining hosts for this loop 16142 1727204139.44632: done getting the remaining hosts for this loop 16142 1727204139.44636: getting the next task for host managed-node2 16142 1727204139.44644: done getting next task for host managed-node2 16142 1727204139.44648: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 16142 1727204139.44651: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204139.44675: getting variables 16142 1727204139.44677: in VariableManager get_vars() 16142 1727204139.44735: Calling all_inventory to load vars for managed-node2 16142 1727204139.44737: Calling groups_inventory to load vars for managed-node2 16142 1727204139.44740: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204139.44751: Calling all_plugins_play to load vars for managed-node2 16142 1727204139.44753: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204139.44756: Calling groups_plugins_play to load vars for managed-node2 16142 1727204139.45824: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000d3 16142 1727204139.45828: WORKER PROCESS EXITING 16142 1727204139.48762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204139.51091: done with get_vars() 16142 1727204139.51119: done getting variables 16142 1727204139.51188: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:55:39 -0400 (0:00:00.121) 0:00:38.689 ***** 16142 1727204139.51226: entering _queue_task() for managed-node2/dnf 16142 1727204139.51954: worker is 1 (out of 1 available) 16142 1727204139.52017: exiting _queue_task() for managed-node2/dnf 16142 1727204139.52047: done queuing things up, now waiting for results queue to drain 16142 1727204139.52051: waiting for pending results... 
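The DNF check queued here (roles/network/tasks/main.yml:36) is guarded by "__network_wireless_connections_defined or __network_team_connections_defined"; the variable dump that follows walks the network_connections list (bond controller plus two DHCP port profiles), finds no wireless or team profiles, evaluates the condition to False, and skips. A rough sketch of such a guarded update check; the network_packages name below is a placeholder, not the role's actual variable:

    - name: Check if updates for network packages are available through DNF (sketch)
      ansible.builtin.dnf:
        name: "{{ network_packages }}"   # placeholder list of NetworkManager-related packages
        state: latest
      when: __network_wireless_connections_defined or __network_team_connections_defined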
16142 1727204139.52691: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 16142 1727204139.53107: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000d4 16142 1727204139.53190: variable 'ansible_search_path' from source: unknown 16142 1727204139.53203: variable 'ansible_search_path' from source: unknown 16142 1727204139.53266: calling self._execute() 16142 1727204139.53405: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204139.53422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204139.53470: variable 'omit' from source: magic vars 16142 1727204139.54346: variable 'ansible_distribution_major_version' from source: facts 16142 1727204139.54365: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204139.54593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204139.60243: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204139.60341: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204139.60403: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204139.60466: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204139.60520: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204139.60645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204139.60713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204139.60752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204139.60810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204139.60832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204139.61004: variable 'ansible_distribution' from source: facts 16142 1727204139.61008: variable 'ansible_distribution_major_version' from source: facts 16142 1727204139.61025: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 16142 1727204139.61150: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204139.61304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204139.61330: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204139.61358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204139.61401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204139.61418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204139.61464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204139.61487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204139.61511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204139.61558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204139.61576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204139.61613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204139.61657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204139.61691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204139.61741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204139.61769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204139.62052: variable 'network_connections' from source: task vars 16142 1727204139.62074: variable 'controller_profile' from source: play vars 16142 1727204139.62170: variable 'controller_profile' from source: play vars 16142 1727204139.62179: variable 'controller_device' from source: play vars 16142 1727204139.62256: variable 'controller_device' from source: play vars 16142 1727204139.62291: variable 'port1_profile' from 
source: play vars 16142 1727204139.62370: variable 'port1_profile' from source: play vars 16142 1727204139.62380: variable 'dhcp_interface1' from source: play vars 16142 1727204139.62443: variable 'dhcp_interface1' from source: play vars 16142 1727204139.62450: variable 'controller_profile' from source: play vars 16142 1727204139.62550: variable 'controller_profile' from source: play vars 16142 1727204139.62554: variable 'port2_profile' from source: play vars 16142 1727204139.62640: variable 'port2_profile' from source: play vars 16142 1727204139.62648: variable 'dhcp_interface2' from source: play vars 16142 1727204139.62715: variable 'dhcp_interface2' from source: play vars 16142 1727204139.62726: variable 'controller_profile' from source: play vars 16142 1727204139.62800: variable 'controller_profile' from source: play vars 16142 1727204139.62912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204139.63125: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204139.63183: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204139.63214: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204139.63248: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204139.63296: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204139.63330: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204139.63396: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204139.63408: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204139.63487: variable '__network_team_connections_defined' from source: role '' defaults 16142 1727204139.63776: variable 'network_connections' from source: task vars 16142 1727204139.63780: variable 'controller_profile' from source: play vars 16142 1727204139.63843: variable 'controller_profile' from source: play vars 16142 1727204139.63885: variable 'controller_device' from source: play vars 16142 1727204139.63947: variable 'controller_device' from source: play vars 16142 1727204139.63974: variable 'port1_profile' from source: play vars 16142 1727204139.64054: variable 'port1_profile' from source: play vars 16142 1727204139.64060: variable 'dhcp_interface1' from source: play vars 16142 1727204139.64170: variable 'dhcp_interface1' from source: play vars 16142 1727204139.64183: variable 'controller_profile' from source: play vars 16142 1727204139.64272: variable 'controller_profile' from source: play vars 16142 1727204139.64276: variable 'port2_profile' from source: play vars 16142 1727204139.64334: variable 'port2_profile' from source: play vars 16142 1727204139.64343: variable 'dhcp_interface2' from source: play vars 16142 1727204139.64416: variable 'dhcp_interface2' from source: play vars 
16142 1727204139.64421: variable 'controller_profile' from source: play vars 16142 1727204139.64501: variable 'controller_profile' from source: play vars 16142 1727204139.64534: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 16142 1727204139.64540: when evaluation is False, skipping this task 16142 1727204139.64543: _execute() done 16142 1727204139.64546: dumping result to json 16142 1727204139.64551: done dumping result, returning 16142 1727204139.64560: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-fddd-f6c7-0000000000d4] 16142 1727204139.64584: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000d4 16142 1727204139.64766: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000d4 16142 1727204139.64770: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 16142 1727204139.64849: no more pending results, returning what we have 16142 1727204139.64854: results queue empty 16142 1727204139.64855: checking for any_errors_fatal 16142 1727204139.64862: done checking for any_errors_fatal 16142 1727204139.64863: checking for max_fail_percentage 16142 1727204139.64866: done checking for max_fail_percentage 16142 1727204139.64867: checking to see if all hosts have failed and the running result is not ok 16142 1727204139.64868: done checking to see if all hosts have failed 16142 1727204139.64869: getting the remaining hosts for this loop 16142 1727204139.64871: done getting the remaining hosts for this loop 16142 1727204139.64875: getting the next task for host managed-node2 16142 1727204139.64881: done getting next task for host managed-node2 16142 1727204139.64886: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 16142 1727204139.64888: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204139.64907: getting variables 16142 1727204139.64909: in VariableManager get_vars() 16142 1727204139.64977: Calling all_inventory to load vars for managed-node2 16142 1727204139.64979: Calling groups_inventory to load vars for managed-node2 16142 1727204139.64982: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204139.64990: Calling all_plugins_play to load vars for managed-node2 16142 1727204139.64992: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204139.64994: Calling groups_plugins_play to load vars for managed-node2 16142 1727204139.66597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204139.68740: done with get_vars() 16142 1727204139.68770: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 16142 1727204139.68832: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:55:39 -0400 (0:00:00.176) 0:00:38.865 ***** 16142 1727204139.68859: entering _queue_task() for managed-node2/yum 16142 1727204139.69116: worker is 1 (out of 1 available) 16142 1727204139.69130: exiting _queue_task() for managed-node2/yum 16142 1727204139.69144: done queuing things up, now waiting for results queue to drain 16142 1727204139.69145: waiting for pending results... 
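The skip recorded above is the normal outcome of a when: guard that evaluates to False: the executor never calls the module and instead returns a result carrying changed=false and the skip_reason seen in the JSON. A hedged reproduction of that behaviour, assuming a placeholder task body (the two __network_*_connections_defined flags are the role defaults named in the log; nothing below is the role's actual source):

    - hosts: localhost
      gather_facts: false
      vars:
        __network_wireless_connections_defined: false   # role defaults resolved in the log
        __network_team_connections_defined: false
      tasks:
        - name: Check for network package updates (sketch of the guarded task)
          ansible.builtin.debug:
            msg: would check for updates here            # placeholder body
          when: __network_wireless_connections_defined or __network_team_connections_defined
          # With both flags false the task is reported as skipped, exactly as in the log.
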
16142 1727204139.69383: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 16142 1727204139.69555: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000d5 16142 1727204139.69569: variable 'ansible_search_path' from source: unknown 16142 1727204139.69573: variable 'ansible_search_path' from source: unknown 16142 1727204139.69606: calling self._execute() 16142 1727204139.69689: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204139.69693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204139.69702: variable 'omit' from source: magic vars 16142 1727204139.70020: variable 'ansible_distribution_major_version' from source: facts 16142 1727204139.70052: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204139.70263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204139.72272: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204139.72348: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204139.72396: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204139.72436: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204139.72471: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204139.72557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204139.72611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204139.72643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204139.72694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204139.72713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204139.72826: variable 'ansible_distribution_major_version' from source: facts 16142 1727204139.72850: Evaluated conditional (ansible_distribution_major_version | int < 8): False 16142 1727204139.72868: when evaluation is False, skipping this task 16142 1727204139.72873: _execute() done 16142 1727204139.72878: dumping result to json 16142 1727204139.72881: done dumping result, returning 16142 1727204139.72890: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-fddd-f6c7-0000000000d5] 16142 
1727204139.72895: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000d5 16142 1727204139.72988: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000d5 16142 1727204139.72991: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 16142 1727204139.73039: no more pending results, returning what we have 16142 1727204139.73043: results queue empty 16142 1727204139.73044: checking for any_errors_fatal 16142 1727204139.73050: done checking for any_errors_fatal 16142 1727204139.73051: checking for max_fail_percentage 16142 1727204139.73053: done checking for max_fail_percentage 16142 1727204139.73054: checking to see if all hosts have failed and the running result is not ok 16142 1727204139.73055: done checking to see if all hosts have failed 16142 1727204139.73055: getting the remaining hosts for this loop 16142 1727204139.73057: done getting the remaining hosts for this loop 16142 1727204139.73061: getting the next task for host managed-node2 16142 1727204139.73073: done getting next task for host managed-node2 16142 1727204139.73078: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 16142 1727204139.73080: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204139.73101: getting variables 16142 1727204139.73103: in VariableManager get_vars() 16142 1727204139.73158: Calling all_inventory to load vars for managed-node2 16142 1727204139.73161: Calling groups_inventory to load vars for managed-node2 16142 1727204139.73165: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204139.73174: Calling all_plugins_play to load vars for managed-node2 16142 1727204139.73176: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204139.73178: Calling groups_plugins_play to load vars for managed-node2 16142 1727204139.74211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204139.75149: done with get_vars() 16142 1727204139.75178: done getting variables 16142 1727204139.75227: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:55:39 -0400 (0:00:00.063) 0:00:38.929 ***** 16142 1727204139.75254: entering _queue_task() for managed-node2/fail 16142 1727204139.75509: worker is 1 (out of 1 available) 16142 1727204139.75523: exiting _queue_task() for managed-node2/fail 16142 1727204139.75536: done queuing things up, now waiting for results queue to drain 16142 1727204139.75538: waiting for pending results... 
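Two details in the YUM variant above are worth noting: the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line shows the yum action being served by the dnf plugin, and the task itself is fenced off by comparing the distribution major version fact, so it only applies to older releases. A small sketch of that version gate (the fact name is the real one from the log; the task body is illustrative):

    - hosts: localhost
      gather_facts: true         # provides ansible_distribution_major_version
      tasks:
        - name: Legacy YUM path, only for distribution major version < 8 (sketch)
          ansible.builtin.debug:
            msg: legacy yum path
          when: ansible_distribution_major_version | int < 8
          # On a host whose major version is 8 or later this is False and the task is skipped.
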
16142 1727204139.75826: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 16142 1727204139.75981: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000d6 16142 1727204139.76007: variable 'ansible_search_path' from source: unknown 16142 1727204139.76017: variable 'ansible_search_path' from source: unknown 16142 1727204139.76059: calling self._execute() 16142 1727204139.76178: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204139.76189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204139.76205: variable 'omit' from source: magic vars 16142 1727204139.76606: variable 'ansible_distribution_major_version' from source: facts 16142 1727204139.76625: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204139.76756: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204139.76966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204139.79411: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204139.79490: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204139.79533: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204139.79578: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204139.79608: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204139.79695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204139.79742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204139.79776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204139.79824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204139.79843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204139.79897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204139.79924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204139.79955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204139.80007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204139.80028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204139.80075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204139.80103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204139.80136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204139.80182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204139.80202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204139.80395: variable 'network_connections' from source: task vars 16142 1727204139.80415: variable 'controller_profile' from source: play vars 16142 1727204139.80501: variable 'controller_profile' from source: play vars 16142 1727204139.80517: variable 'controller_device' from source: play vars 16142 1727204139.80593: variable 'controller_device' from source: play vars 16142 1727204139.80609: variable 'port1_profile' from source: play vars 16142 1727204139.80681: variable 'port1_profile' from source: play vars 16142 1727204139.80694: variable 'dhcp_interface1' from source: play vars 16142 1727204139.80763: variable 'dhcp_interface1' from source: play vars 16142 1727204139.80778: variable 'controller_profile' from source: play vars 16142 1727204139.80844: variable 'controller_profile' from source: play vars 16142 1727204139.80857: variable 'port2_profile' from source: play vars 16142 1727204139.80927: variable 'port2_profile' from source: play vars 16142 1727204139.80941: variable 'dhcp_interface2' from source: play vars 16142 1727204139.81011: variable 'dhcp_interface2' from source: play vars 16142 1727204139.81024: variable 'controller_profile' from source: play vars 16142 1727204139.81092: variable 'controller_profile' from source: play vars 16142 1727204139.81171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204139.81367: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204139.81417: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204139.81451: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204139.81486: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204139.81537: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204139.81562: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204139.81592: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204139.81622: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204139.81699: variable '__network_team_connections_defined' from source: role '' defaults 16142 1727204139.81941: variable 'network_connections' from source: task vars 16142 1727204139.81952: variable 'controller_profile' from source: play vars 16142 1727204139.82015: variable 'controller_profile' from source: play vars 16142 1727204139.82028: variable 'controller_device' from source: play vars 16142 1727204139.82088: variable 'controller_device' from source: play vars 16142 1727204139.82101: variable 'port1_profile' from source: play vars 16142 1727204139.82157: variable 'port1_profile' from source: play vars 16142 1727204139.82171: variable 'dhcp_interface1' from source: play vars 16142 1727204139.82230: variable 'dhcp_interface1' from source: play vars 16142 1727204139.82241: variable 'controller_profile' from source: play vars 16142 1727204139.82303: variable 'controller_profile' from source: play vars 16142 1727204139.82313: variable 'port2_profile' from source: play vars 16142 1727204139.82371: variable 'port2_profile' from source: play vars 16142 1727204139.82382: variable 'dhcp_interface2' from source: play vars 16142 1727204139.82442: variable 'dhcp_interface2' from source: play vars 16142 1727204139.82453: variable 'controller_profile' from source: play vars 16142 1727204139.82515: variable 'controller_profile' from source: play vars 16142 1727204139.82551: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 16142 1727204139.82559: when evaluation is False, skipping this task 16142 1727204139.82567: _execute() done 16142 1727204139.82575: dumping result to json 16142 1727204139.82581: done dumping result, returning 16142 1727204139.82592: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-fddd-f6c7-0000000000d6] 16142 1727204139.82601: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000d6 16142 1727204139.82715: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000d6 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 16142 1727204139.82770: no more pending results, returning what we have 16142 1727204139.82774: results queue empty 16142 1727204139.82775: checking for any_errors_fatal 16142 1727204139.82780: done checking for any_errors_fatal 16142 1727204139.82781: checking for max_fail_percentage 16142 
1727204139.82783: done checking for max_fail_percentage 16142 1727204139.82784: checking to see if all hosts have failed and the running result is not ok 16142 1727204139.82785: done checking to see if all hosts have failed 16142 1727204139.82785: getting the remaining hosts for this loop 16142 1727204139.82787: done getting the remaining hosts for this loop 16142 1727204139.82791: getting the next task for host managed-node2 16142 1727204139.82797: done getting next task for host managed-node2 16142 1727204139.82801: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 16142 1727204139.82804: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204139.82826: getting variables 16142 1727204139.82828: in VariableManager get_vars() 16142 1727204139.82885: Calling all_inventory to load vars for managed-node2 16142 1727204139.82888: Calling groups_inventory to load vars for managed-node2 16142 1727204139.82891: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204139.82901: Calling all_plugins_play to load vars for managed-node2 16142 1727204139.82904: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204139.82907: Calling groups_plugins_play to load vars for managed-node2 16142 1727204139.84387: WORKER PROCESS EXITING 16142 1727204139.84717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204139.86394: done with get_vars() 16142 1727204139.86427: done getting variables 16142 1727204139.86494: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:55:39 -0400 (0:00:00.112) 0:00:39.042 ***** 16142 1727204139.86532: entering _queue_task() for managed-node2/package 16142 1727204139.86868: worker is 1 (out of 1 available) 16142 1727204139.86882: exiting _queue_task() for managed-node2/package 16142 1727204139.86895: done queuing things up, now waiting for results queue to drain 16142 1727204139.86896: waiting for pending results... 
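The "Install packages" task queued above loads the generic package action plugin, which dispatches to the target host's package manager at run time. A minimal, assumed illustration of that pattern (network_packages is the role variable named in the log; the concrete package list below is a placeholder):

    - hosts: localhost
      become: true
      vars:
        network_packages:            # role variable from the log; this list is a placeholder
          - NetworkManager
      tasks:
        - name: Install packages (sketch)
          ansible.builtin.package:   # generic action; resolves to dnf, apt, ... on the target
            name: "{{ network_packages }}"
            state: present
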
16142 1727204139.87193: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 16142 1727204139.87351: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000d7 16142 1727204139.87373: variable 'ansible_search_path' from source: unknown 16142 1727204139.87381: variable 'ansible_search_path' from source: unknown 16142 1727204139.87422: calling self._execute() 16142 1727204139.87522: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204139.87533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204139.87548: variable 'omit' from source: magic vars 16142 1727204139.87943: variable 'ansible_distribution_major_version' from source: facts 16142 1727204139.87963: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204139.88183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204139.88460: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204139.88511: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204139.88555: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204139.88644: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204139.88762: variable 'network_packages' from source: role '' defaults 16142 1727204139.88879: variable '__network_provider_setup' from source: role '' defaults 16142 1727204139.88898: variable '__network_service_name_default_nm' from source: role '' defaults 16142 1727204139.88964: variable '__network_service_name_default_nm' from source: role '' defaults 16142 1727204139.88982: variable '__network_packages_default_nm' from source: role '' defaults 16142 1727204139.89043: variable '__network_packages_default_nm' from source: role '' defaults 16142 1727204139.89233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204139.91822: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204139.91900: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204139.91945: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204139.91982: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204139.92016: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204139.92099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204139.92136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204139.92167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204139.92211: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204139.92234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204139.92282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204139.92309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204139.92341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204139.92387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204139.92406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204139.92642: variable '__network_packages_default_gobject_packages' from source: role '' defaults 16142 1727204139.92766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204139.92794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204139.92822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204139.92868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204139.92890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204139.92986: variable 'ansible_python' from source: facts 16142 1727204139.93016: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 16142 1727204139.93105: variable '__network_wpa_supplicant_required' from source: role '' defaults 16142 1727204139.93188: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 16142 1727204139.93318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204139.93345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 16142 1727204139.93378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204139.93424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204139.93444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204139.93495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204139.93534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204139.93567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204139.93609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204139.93634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204139.93791: variable 'network_connections' from source: task vars 16142 1727204139.93803: variable 'controller_profile' from source: play vars 16142 1727204139.93908: variable 'controller_profile' from source: play vars 16142 1727204139.93923: variable 'controller_device' from source: play vars 16142 1727204139.94029: variable 'controller_device' from source: play vars 16142 1727204139.94046: variable 'port1_profile' from source: play vars 16142 1727204139.94149: variable 'port1_profile' from source: play vars 16142 1727204139.94168: variable 'dhcp_interface1' from source: play vars 16142 1727204139.94271: variable 'dhcp_interface1' from source: play vars 16142 1727204139.94289: variable 'controller_profile' from source: play vars 16142 1727204139.94396: variable 'controller_profile' from source: play vars 16142 1727204139.94410: variable 'port2_profile' from source: play vars 16142 1727204139.94535: variable 'port2_profile' from source: play vars 16142 1727204139.94540: variable 'dhcp_interface2' from source: play vars 16142 1727204139.94611: variable 'dhcp_interface2' from source: play vars 16142 1727204139.94618: variable 'controller_profile' from source: play vars 16142 1727204139.94695: variable 'controller_profile' from source: play vars 16142 1727204139.94757: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204139.94779: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 
1727204139.94799: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204139.94828: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204139.94868: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204139.95073: variable 'network_connections' from source: task vars 16142 1727204139.95077: variable 'controller_profile' from source: play vars 16142 1727204139.95150: variable 'controller_profile' from source: play vars 16142 1727204139.95158: variable 'controller_device' from source: play vars 16142 1727204139.95228: variable 'controller_device' from source: play vars 16142 1727204139.95238: variable 'port1_profile' from source: play vars 16142 1727204139.95312: variable 'port1_profile' from source: play vars 16142 1727204139.95319: variable 'dhcp_interface1' from source: play vars 16142 1727204139.95393: variable 'dhcp_interface1' from source: play vars 16142 1727204139.95401: variable 'controller_profile' from source: play vars 16142 1727204139.95471: variable 'controller_profile' from source: play vars 16142 1727204139.95483: variable 'port2_profile' from source: play vars 16142 1727204139.95550: variable 'port2_profile' from source: play vars 16142 1727204139.95558: variable 'dhcp_interface2' from source: play vars 16142 1727204139.95632: variable 'dhcp_interface2' from source: play vars 16142 1727204139.95641: variable 'controller_profile' from source: play vars 16142 1727204139.95713: variable 'controller_profile' from source: play vars 16142 1727204139.95757: variable '__network_packages_default_wireless' from source: role '' defaults 16142 1727204139.95816: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204139.96021: variable 'network_connections' from source: task vars 16142 1727204139.96025: variable 'controller_profile' from source: play vars 16142 1727204139.96075: variable 'controller_profile' from source: play vars 16142 1727204139.96081: variable 'controller_device' from source: play vars 16142 1727204139.96124: variable 'controller_device' from source: play vars 16142 1727204139.96134: variable 'port1_profile' from source: play vars 16142 1727204139.96183: variable 'port1_profile' from source: play vars 16142 1727204139.96189: variable 'dhcp_interface1' from source: play vars 16142 1727204139.96233: variable 'dhcp_interface1' from source: play vars 16142 1727204139.96241: variable 'controller_profile' from source: play vars 16142 1727204139.96290: variable 'controller_profile' from source: play vars 16142 1727204139.96296: variable 'port2_profile' from source: play vars 16142 1727204139.96347: variable 'port2_profile' from source: play vars 16142 1727204139.96381: variable 'dhcp_interface2' from source: play vars 16142 1727204139.96409: variable 'dhcp_interface2' from source: play vars 16142 1727204139.96415: variable 'controller_profile' from source: play vars 16142 1727204139.96483: variable 'controller_profile' from source: play vars 16142 1727204139.96510: variable '__network_packages_default_team' from source: role '' defaults 16142 1727204139.96588: variable '__network_team_connections_defined' from source: role '' defaults 16142 1727204139.96847: variable 'network_connections' from source: 
task vars 16142 1727204139.96853: variable 'controller_profile' from source: play vars 16142 1727204139.97007: variable 'controller_profile' from source: play vars 16142 1727204139.97010: variable 'controller_device' from source: play vars 16142 1727204139.97013: variable 'controller_device' from source: play vars 16142 1727204139.97015: variable 'port1_profile' from source: play vars 16142 1727204139.97049: variable 'port1_profile' from source: play vars 16142 1727204139.97055: variable 'dhcp_interface1' from source: play vars 16142 1727204139.97116: variable 'dhcp_interface1' from source: play vars 16142 1727204139.97122: variable 'controller_profile' from source: play vars 16142 1727204139.97189: variable 'controller_profile' from source: play vars 16142 1727204139.97203: variable 'port2_profile' from source: play vars 16142 1727204139.97262: variable 'port2_profile' from source: play vars 16142 1727204139.97270: variable 'dhcp_interface2' from source: play vars 16142 1727204139.97330: variable 'dhcp_interface2' from source: play vars 16142 1727204139.97336: variable 'controller_profile' from source: play vars 16142 1727204139.97400: variable 'controller_profile' from source: play vars 16142 1727204139.97470: variable '__network_service_name_default_initscripts' from source: role '' defaults 16142 1727204139.97529: variable '__network_service_name_default_initscripts' from source: role '' defaults 16142 1727204139.97534: variable '__network_packages_default_initscripts' from source: role '' defaults 16142 1727204139.97594: variable '__network_packages_default_initscripts' from source: role '' defaults 16142 1727204139.97812: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 16142 1727204139.98229: variable 'network_connections' from source: task vars 16142 1727204139.98233: variable 'controller_profile' from source: play vars 16142 1727204139.98279: variable 'controller_profile' from source: play vars 16142 1727204139.98289: variable 'controller_device' from source: play vars 16142 1727204139.98330: variable 'controller_device' from source: play vars 16142 1727204139.98339: variable 'port1_profile' from source: play vars 16142 1727204139.98381: variable 'port1_profile' from source: play vars 16142 1727204139.98387: variable 'dhcp_interface1' from source: play vars 16142 1727204139.98450: variable 'dhcp_interface1' from source: play vars 16142 1727204139.98454: variable 'controller_profile' from source: play vars 16142 1727204139.98498: variable 'controller_profile' from source: play vars 16142 1727204139.98504: variable 'port2_profile' from source: play vars 16142 1727204139.98550: variable 'port2_profile' from source: play vars 16142 1727204139.98555: variable 'dhcp_interface2' from source: play vars 16142 1727204139.98602: variable 'dhcp_interface2' from source: play vars 16142 1727204139.98607: variable 'controller_profile' from source: play vars 16142 1727204139.98652: variable 'controller_profile' from source: play vars 16142 1727204139.98658: variable 'ansible_distribution' from source: facts 16142 1727204139.98661: variable '__network_rh_distros' from source: role '' defaults 16142 1727204139.98671: variable 'ansible_distribution_major_version' from source: facts 16142 1727204139.98693: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 16142 1727204139.98805: variable 'ansible_distribution' from source: facts 16142 1727204139.98809: variable '__network_rh_distros' from source: role '' defaults 
16142 1727204139.98814: variable 'ansible_distribution_major_version' from source: facts 16142 1727204139.98825: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 16142 1727204139.98935: variable 'ansible_distribution' from source: facts 16142 1727204139.98946: variable '__network_rh_distros' from source: role '' defaults 16142 1727204139.98948: variable 'ansible_distribution_major_version' from source: facts 16142 1727204139.98978: variable 'network_provider' from source: set_fact 16142 1727204139.98989: variable 'ansible_facts' from source: unknown 16142 1727204139.99418: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 16142 1727204139.99421: when evaluation is False, skipping this task 16142 1727204139.99425: _execute() done 16142 1727204139.99428: dumping result to json 16142 1727204139.99430: done dumping result, returning 16142 1727204139.99442: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-fddd-f6c7-0000000000d7] 16142 1727204139.99446: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000d7 16142 1727204139.99539: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000d7 16142 1727204139.99543: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 16142 1727204139.99589: no more pending results, returning what we have 16142 1727204139.99594: results queue empty 16142 1727204139.99594: checking for any_errors_fatal 16142 1727204139.99600: done checking for any_errors_fatal 16142 1727204139.99601: checking for max_fail_percentage 16142 1727204139.99603: done checking for max_fail_percentage 16142 1727204139.99604: checking to see if all hosts have failed and the running result is not ok 16142 1727204139.99605: done checking to see if all hosts have failed 16142 1727204139.99605: getting the remaining hosts for this loop 16142 1727204139.99607: done getting the remaining hosts for this loop 16142 1727204139.99610: getting the next task for host managed-node2 16142 1727204139.99618: done getting next task for host managed-node2 16142 1727204139.99622: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 16142 1727204139.99624: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204139.99644: getting variables 16142 1727204139.99646: in VariableManager get_vars() 16142 1727204139.99702: Calling all_inventory to load vars for managed-node2 16142 1727204139.99704: Calling groups_inventory to load vars for managed-node2 16142 1727204139.99706: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204139.99718: Calling all_plugins_play to load vars for managed-node2 16142 1727204139.99721: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204139.99723: Calling groups_plugins_play to load vars for managed-node2 16142 1727204140.00756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204140.01787: done with get_vars() 16142 1727204140.01810: done getting variables 16142 1727204140.01859: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:55:40 -0400 (0:00:00.153) 0:00:39.195 ***** 16142 1727204140.01889: entering _queue_task() for managed-node2/package 16142 1727204140.02392: worker is 1 (out of 1 available) 16142 1727204140.02421: exiting _queue_task() for managed-node2/package 16142 1727204140.02447: done queuing things up, now waiting for results queue to drain 16142 1727204140.02450: waiting for pending results... 
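The False result for "not network_packages is subset(ansible_facts.packages.keys())" is what lets the install step be skipped: every requested package already appears in the package facts gathered earlier in the run. The subset test is a builtin Jinja2 test, and ansible_facts.packages is populated by package_facts; a self-contained sketch of the same guard (package names are illustrative):

    - hosts: localhost
      become: true
      vars:
        network_packages: [NetworkManager]   # illustrative list
      tasks:
        - name: Gather the installed-package inventory
          ansible.builtin.package_facts:

        - name: Install only when something is missing (sketch)
          ansible.builtin.package:
            name: "{{ network_packages }}"
            state: present
          when: not network_packages is subset(ansible_facts.packages.keys())
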
16142 1727204140.03509: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 16142 1727204140.03581: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000d8 16142 1727204140.03611: variable 'ansible_search_path' from source: unknown 16142 1727204140.03684: variable 'ansible_search_path' from source: unknown 16142 1727204140.03689: calling self._execute() 16142 1727204140.03735: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204140.03826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204140.03835: variable 'omit' from source: magic vars 16142 1727204140.04179: variable 'ansible_distribution_major_version' from source: facts 16142 1727204140.04189: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204140.04286: variable 'network_state' from source: role '' defaults 16142 1727204140.04295: Evaluated conditional (network_state != {}): False 16142 1727204140.04299: when evaluation is False, skipping this task 16142 1727204140.04302: _execute() done 16142 1727204140.04304: dumping result to json 16142 1727204140.04308: done dumping result, returning 16142 1727204140.04324: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-fddd-f6c7-0000000000d8] 16142 1727204140.04331: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000d8 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204140.04688: no more pending results, returning what we have 16142 1727204140.04693: results queue empty 16142 1727204140.04694: checking for any_errors_fatal 16142 1727204140.04704: done checking for any_errors_fatal 16142 1727204140.04705: checking for max_fail_percentage 16142 1727204140.04707: done checking for max_fail_percentage 16142 1727204140.04708: checking to see if all hosts have failed and the running result is not ok 16142 1727204140.04709: done checking to see if all hosts have failed 16142 1727204140.04714: getting the remaining hosts for this loop 16142 1727204140.04717: done getting the remaining hosts for this loop 16142 1727204140.04725: getting the next task for host managed-node2 16142 1727204140.04738: done getting next task for host managed-node2 16142 1727204140.04742: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 16142 1727204140.04745: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204140.04781: getting variables 16142 1727204140.04785: in VariableManager get_vars() 16142 1727204140.04863: Calling all_inventory to load vars for managed-node2 16142 1727204140.04870: Calling groups_inventory to load vars for managed-node2 16142 1727204140.04873: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204140.04888: Calling all_plugins_play to load vars for managed-node2 16142 1727204140.04892: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204140.04899: Calling groups_plugins_play to load vars for managed-node2 16142 1727204140.05550: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000d8 16142 1727204140.05553: WORKER PROCESS EXITING 16142 1727204140.09417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204140.11494: done with get_vars() 16142 1727204140.11538: done getting variables 16142 1727204140.11603: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:55:40 -0400 (0:00:00.097) 0:00:39.293 ***** 16142 1727204140.11650: entering _queue_task() for managed-node2/package 16142 1727204140.12893: worker is 1 (out of 1 available) 16142 1727204140.12906: exiting _queue_task() for managed-node2/package 16142 1727204140.12920: done queuing things up, now waiting for results queue to drain 16142 1727204140.12921: waiting for pending results... 
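The nmstate-related install task above is skipped because network_state stays at its empty-dict role default in this run, so the guard network_state != {} is False unless the caller supplies a declarative state. A hedged sketch of that guard (the variable name and its empty default follow from the log; the package list is assumed):

    - hosts: localhost
      become: true
      vars:
        network_state: {}            # role default; remains empty in this run
      tasks:
        - name: Install NetworkManager and nmstate when using network_state variable (sketch)
          ansible.builtin.package:
            name: [NetworkManager, nmstate]   # assumed package names
            state: present
          when: network_state != {}
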
16142 1727204140.13968: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 16142 1727204140.14099: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000d9 16142 1727204140.14112: variable 'ansible_search_path' from source: unknown 16142 1727204140.14115: variable 'ansible_search_path' from source: unknown 16142 1727204140.14158: calling self._execute() 16142 1727204140.14263: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204140.14273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204140.14285: variable 'omit' from source: magic vars 16142 1727204140.14665: variable 'ansible_distribution_major_version' from source: facts 16142 1727204140.14678: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204140.14800: variable 'network_state' from source: role '' defaults 16142 1727204140.14807: Evaluated conditional (network_state != {}): False 16142 1727204140.14816: when evaluation is False, skipping this task 16142 1727204140.14819: _execute() done 16142 1727204140.14824: dumping result to json 16142 1727204140.14827: done dumping result, returning 16142 1727204140.14835: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-fddd-f6c7-0000000000d9] 16142 1727204140.14846: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000d9 16142 1727204140.14953: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000d9 16142 1727204140.14956: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204140.15007: no more pending results, returning what we have 16142 1727204140.15011: results queue empty 16142 1727204140.15012: checking for any_errors_fatal 16142 1727204140.15017: done checking for any_errors_fatal 16142 1727204140.15018: checking for max_fail_percentage 16142 1727204140.15020: done checking for max_fail_percentage 16142 1727204140.15021: checking to see if all hosts have failed and the running result is not ok 16142 1727204140.15021: done checking to see if all hosts have failed 16142 1727204140.15022: getting the remaining hosts for this loop 16142 1727204140.15024: done getting the remaining hosts for this loop 16142 1727204140.15028: getting the next task for host managed-node2 16142 1727204140.15036: done getting next task for host managed-node2 16142 1727204140.15039: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 16142 1727204140.15042: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204140.15070: getting variables 16142 1727204140.15072: in VariableManager get_vars() 16142 1727204140.15122: Calling all_inventory to load vars for managed-node2 16142 1727204140.15124: Calling groups_inventory to load vars for managed-node2 16142 1727204140.15127: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204140.15135: Calling all_plugins_play to load vars for managed-node2 16142 1727204140.15138: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204140.15141: Calling groups_plugins_play to load vars for managed-node2 16142 1727204140.18565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204140.22422: done with get_vars() 16142 1727204140.22461: done getting variables 16142 1727204140.22914: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:55:40 -0400 (0:00:00.113) 0:00:39.406 ***** 16142 1727204140.22954: entering _queue_task() for managed-node2/service 16142 1727204140.23304: worker is 1 (out of 1 available) 16142 1727204140.23319: exiting _queue_task() for managed-node2/service 16142 1727204140.23334: done queuing things up, now waiting for results queue to drain 16142 1727204140.23335: waiting for pending results... 
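The task queued above uses the service action plugin; per its name it would restart NetworkManager only when wireless or team interfaces are in play, which the following entries again evaluate via the same role-default flags. A minimal, assumed sketch of such a guarded restart (not the role's actual task):

    - hosts: localhost
      become: true
      vars:
        __network_wireless_connections_defined: false
        __network_team_connections_defined: false
      tasks:
        - name: Restart NetworkManager due to wireless or team interfaces (sketch)
          ansible.builtin.service:
            name: NetworkManager
            state: restarted
          when: __network_wireless_connections_defined or __network_team_connections_defined
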
16142 1727204140.23654: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 16142 1727204140.23825: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000da 16142 1727204140.23845: variable 'ansible_search_path' from source: unknown 16142 1727204140.23852: variable 'ansible_search_path' from source: unknown 16142 1727204140.23897: calling self._execute() 16142 1727204140.24010: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204140.24023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204140.24042: variable 'omit' from source: magic vars 16142 1727204140.24445: variable 'ansible_distribution_major_version' from source: facts 16142 1727204140.24467: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204140.24598: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204140.24817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204140.27292: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204140.27373: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204140.27422: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204140.27465: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204140.27504: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204140.27589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204140.27643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204140.27675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204140.27728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204140.27748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204140.27805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204140.27840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204140.27873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 16142 1727204140.27922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204140.27945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204140.28027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204140.28060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204140.28091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204140.28140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204140.28168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204140.28373: variable 'network_connections' from source: task vars 16142 1727204140.28392: variable 'controller_profile' from source: play vars 16142 1727204140.28478: variable 'controller_profile' from source: play vars 16142 1727204140.28492: variable 'controller_device' from source: play vars 16142 1727204140.28563: variable 'controller_device' from source: play vars 16142 1727204140.28584: variable 'port1_profile' from source: play vars 16142 1727204140.28643: variable 'port1_profile' from source: play vars 16142 1727204140.28655: variable 'dhcp_interface1' from source: play vars 16142 1727204140.28727: variable 'dhcp_interface1' from source: play vars 16142 1727204140.28739: variable 'controller_profile' from source: play vars 16142 1727204140.28810: variable 'controller_profile' from source: play vars 16142 1727204140.28823: variable 'port2_profile' from source: play vars 16142 1727204140.28890: variable 'port2_profile' from source: play vars 16142 1727204140.28908: variable 'dhcp_interface2' from source: play vars 16142 1727204140.28969: variable 'dhcp_interface2' from source: play vars 16142 1727204140.28980: variable 'controller_profile' from source: play vars 16142 1727204140.29047: variable 'controller_profile' from source: play vars 16142 1727204140.29131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204140.29325: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204140.29375: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204140.29410: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204140.29454: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204140.29505: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204140.29537: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204140.29574: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204140.29605: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204140.29694: variable '__network_team_connections_defined' from source: role '' defaults 16142 1727204140.29980: variable 'network_connections' from source: task vars 16142 1727204140.29995: variable 'controller_profile' from source: play vars 16142 1727204140.30059: variable 'controller_profile' from source: play vars 16142 1727204140.30090: variable 'controller_device' from source: play vars 16142 1727204140.30182: variable 'controller_device' from source: play vars 16142 1727204140.30222: variable 'port1_profile' from source: play vars 16142 1727204140.30289: variable 'port1_profile' from source: play vars 16142 1727204140.30302: variable 'dhcp_interface1' from source: play vars 16142 1727204140.30382: variable 'dhcp_interface1' from source: play vars 16142 1727204140.30398: variable 'controller_profile' from source: play vars 16142 1727204140.30469: variable 'controller_profile' from source: play vars 16142 1727204140.30481: variable 'port2_profile' from source: play vars 16142 1727204140.30549: variable 'port2_profile' from source: play vars 16142 1727204140.30560: variable 'dhcp_interface2' from source: play vars 16142 1727204140.30627: variable 'dhcp_interface2' from source: play vars 16142 1727204140.30639: variable 'controller_profile' from source: play vars 16142 1727204140.30706: variable 'controller_profile' from source: play vars 16142 1727204140.30749: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 16142 1727204140.30762: when evaluation is False, skipping this task 16142 1727204140.30772: _execute() done 16142 1727204140.30779: dumping result to json 16142 1727204140.30786: done dumping result, returning 16142 1727204140.30798: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-fddd-f6c7-0000000000da] 16142 1727204140.30808: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000da 16142 1727204140.30934: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000da skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 16142 1727204140.30989: no more pending results, returning what we have 16142 1727204140.30994: results queue empty 16142 1727204140.30995: checking for any_errors_fatal 16142 1727204140.31002: done checking for any_errors_fatal 16142 1727204140.31003: checking for max_fail_percentage 16142 1727204140.31005: done checking for max_fail_percentage 16142 1727204140.31006: checking to see if all hosts have failed 
and the running result is not ok 16142 1727204140.31007: done checking to see if all hosts have failed 16142 1727204140.31008: getting the remaining hosts for this loop 16142 1727204140.31009: done getting the remaining hosts for this loop 16142 1727204140.31014: getting the next task for host managed-node2 16142 1727204140.31022: done getting next task for host managed-node2 16142 1727204140.31026: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 16142 1727204140.31029: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204140.31051: getting variables 16142 1727204140.31054: in VariableManager get_vars() 16142 1727204140.31120: Calling all_inventory to load vars for managed-node2 16142 1727204140.31123: Calling groups_inventory to load vars for managed-node2 16142 1727204140.31126: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204140.31137: Calling all_plugins_play to load vars for managed-node2 16142 1727204140.31141: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204140.31145: Calling groups_plugins_play to load vars for managed-node2 16142 1727204140.31838: WORKER PROCESS EXITING 16142 1727204140.32581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204140.34647: done with get_vars() 16142 1727204140.34679: done getting variables 16142 1727204140.34746: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:55:40 -0400 (0:00:00.118) 0:00:39.524 ***** 16142 1727204140.34785: entering _queue_task() for managed-node2/service 16142 1727204140.35970: worker is 1 (out of 1 available) 16142 1727204140.35986: exiting _queue_task() for managed-node2/service 16142 1727204140.36001: done queuing things up, now waiting for results queue to drain 16142 1727204140.36002: waiting for pending results... 
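The restart task above is skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined is true for this play: the network_connections being templated reference a controller profile plus two port profiles over DHCP interfaces, none of which is a wireless or team connection. A sketch of play vars shaped like the ones named in the log (profile and interface names, the bond type, and the field layout are illustrative assumptions; only the variable names come from the log):

    controller_profile: bond0
    controller_device: nm-bond
    port1_profile: bond0.0
    dhcp_interface1: test1
    port2_profile: bond0.1
    dhcp_interface2: test2
    network_connections:
      # controller profile (e.g. a bond)
      - name: "{{ controller_profile }}"
        type: bond
        interface_name: "{{ controller_device }}"
      # first port attached to the controller
      - name: "{{ port1_profile }}"
        type: ethernet
        interface_name: "{{ dhcp_interface1 }}"
        controller: "{{ controller_profile }}"
      # second port attached to the controller
      - name: "{{ port2_profile }}"
        type: ethernet
        interface_name: "{{ dhcp_interface2 }}"
        controller: "{{ controller_profile }}"

With no connection of type wireless or team in such a list, the "Restart NetworkManager due to wireless or team interfaces" conditional is False, whereas the following "Enable and start NetworkManager" task does run: network_state is {} here, so the True result of (network_provider == "nm" or network_state != {}) implies network_provider is "nm".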
16142 1727204140.36429: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 16142 1727204140.36551: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000db 16142 1727204140.36567: variable 'ansible_search_path' from source: unknown 16142 1727204140.36571: variable 'ansible_search_path' from source: unknown 16142 1727204140.36614: calling self._execute() 16142 1727204140.36713: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204140.36718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204140.36730: variable 'omit' from source: magic vars 16142 1727204140.37125: variable 'ansible_distribution_major_version' from source: facts 16142 1727204140.37139: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204140.37305: variable 'network_provider' from source: set_fact 16142 1727204140.37309: variable 'network_state' from source: role '' defaults 16142 1727204140.37321: Evaluated conditional (network_provider == "nm" or network_state != {}): True 16142 1727204140.37329: variable 'omit' from source: magic vars 16142 1727204140.37403: variable 'omit' from source: magic vars 16142 1727204140.37428: variable 'network_service_name' from source: role '' defaults 16142 1727204140.37505: variable 'network_service_name' from source: role '' defaults 16142 1727204140.37607: variable '__network_provider_setup' from source: role '' defaults 16142 1727204140.37611: variable '__network_service_name_default_nm' from source: role '' defaults 16142 1727204140.37670: variable '__network_service_name_default_nm' from source: role '' defaults 16142 1727204140.37685: variable '__network_packages_default_nm' from source: role '' defaults 16142 1727204140.37743: variable '__network_packages_default_nm' from source: role '' defaults 16142 1727204140.37976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204140.42406: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204140.42410: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204140.42420: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204140.42460: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204140.42494: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204140.42567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204140.42704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204140.43385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204140.43427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 16142 1727204140.43444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204140.43490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204140.43516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204140.43542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204140.43583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204140.43596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204140.44418: variable '__network_packages_default_gobject_packages' from source: role '' defaults 16142 1727204140.44571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204140.44612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204140.44642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204140.44698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204140.44726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204140.44846: variable 'ansible_python' from source: facts 16142 1727204140.44879: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 16142 1727204140.44991: variable '__network_wpa_supplicant_required' from source: role '' defaults 16142 1727204140.45095: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 16142 1727204140.45262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204140.45294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204140.45327: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204140.45387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204140.45405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204140.45473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204140.45514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204140.45546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204140.45614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204140.45639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204140.45812: variable 'network_connections' from source: task vars 16142 1727204140.45827: variable 'controller_profile' from source: play vars 16142 1727204140.46238: variable 'controller_profile' from source: play vars 16142 1727204140.46258: variable 'controller_device' from source: play vars 16142 1727204140.46344: variable 'controller_device' from source: play vars 16142 1727204140.46361: variable 'port1_profile' from source: play vars 16142 1727204140.46480: variable 'port1_profile' from source: play vars 16142 1727204140.46517: variable 'dhcp_interface1' from source: play vars 16142 1727204140.46632: variable 'dhcp_interface1' from source: play vars 16142 1727204140.46784: variable 'controller_profile' from source: play vars 16142 1727204140.46855: variable 'controller_profile' from source: play vars 16142 1727204140.46997: variable 'port2_profile' from source: play vars 16142 1727204140.47070: variable 'port2_profile' from source: play vars 16142 1727204140.47111: variable 'dhcp_interface2' from source: play vars 16142 1727204140.47243: variable 'dhcp_interface2' from source: play vars 16142 1727204140.47324: variable 'controller_profile' from source: play vars 16142 1727204140.47453: variable 'controller_profile' from source: play vars 16142 1727204140.47736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204140.48422: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204140.48499: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204140.48578: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 
1727204140.48634: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204140.48716: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204140.48758: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204140.48798: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204140.48836: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204140.48905: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204140.49225: variable 'network_connections' from source: task vars 16142 1727204140.49238: variable 'controller_profile' from source: play vars 16142 1727204140.49339: variable 'controller_profile' from source: play vars 16142 1727204140.49355: variable 'controller_device' from source: play vars 16142 1727204140.49442: variable 'controller_device' from source: play vars 16142 1727204140.49459: variable 'port1_profile' from source: play vars 16142 1727204140.49543: variable 'port1_profile' from source: play vars 16142 1727204140.49558: variable 'dhcp_interface1' from source: play vars 16142 1727204140.49642: variable 'dhcp_interface1' from source: play vars 16142 1727204140.49656: variable 'controller_profile' from source: play vars 16142 1727204140.49742: variable 'controller_profile' from source: play vars 16142 1727204140.49757: variable 'port2_profile' from source: play vars 16142 1727204140.49828: variable 'port2_profile' from source: play vars 16142 1727204140.49852: variable 'dhcp_interface2' from source: play vars 16142 1727204140.49922: variable 'dhcp_interface2' from source: play vars 16142 1727204140.49939: variable 'controller_profile' from source: play vars 16142 1727204140.50021: variable 'controller_profile' from source: play vars 16142 1727204140.50090: variable '__network_packages_default_wireless' from source: role '' defaults 16142 1727204140.50179: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204140.50495: variable 'network_connections' from source: task vars 16142 1727204140.50511: variable 'controller_profile' from source: play vars 16142 1727204140.50585: variable 'controller_profile' from source: play vars 16142 1727204140.50603: variable 'controller_device' from source: play vars 16142 1727204140.50680: variable 'controller_device' from source: play vars 16142 1727204140.50694: variable 'port1_profile' from source: play vars 16142 1727204140.50776: variable 'port1_profile' from source: play vars 16142 1727204140.50787: variable 'dhcp_interface1' from source: play vars 16142 1727204140.50867: variable 'dhcp_interface1' from source: play vars 16142 1727204140.50879: variable 'controller_profile' from source: play vars 16142 1727204140.50968: variable 'controller_profile' from source: play vars 16142 1727204140.50983: variable 'port2_profile' from source: play vars 16142 1727204140.51059: variable 'port2_profile' from source: play vars 16142 
1727204140.51074: variable 'dhcp_interface2' from source: play vars 16142 1727204140.51148: variable 'dhcp_interface2' from source: play vars 16142 1727204140.51168: variable 'controller_profile' from source: play vars 16142 1727204140.51238: variable 'controller_profile' from source: play vars 16142 1727204140.51287: variable '__network_packages_default_team' from source: role '' defaults 16142 1727204140.51380: variable '__network_team_connections_defined' from source: role '' defaults 16142 1727204140.51754: variable 'network_connections' from source: task vars 16142 1727204140.51767: variable 'controller_profile' from source: play vars 16142 1727204140.51848: variable 'controller_profile' from source: play vars 16142 1727204140.51859: variable 'controller_device' from source: play vars 16142 1727204140.51948: variable 'controller_device' from source: play vars 16142 1727204140.51961: variable 'port1_profile' from source: play vars 16142 1727204140.52041: variable 'port1_profile' from source: play vars 16142 1727204140.52052: variable 'dhcp_interface1' from source: play vars 16142 1727204140.52122: variable 'dhcp_interface1' from source: play vars 16142 1727204140.52144: variable 'controller_profile' from source: play vars 16142 1727204140.52215: variable 'controller_profile' from source: play vars 16142 1727204140.52226: variable 'port2_profile' from source: play vars 16142 1727204140.52308: variable 'port2_profile' from source: play vars 16142 1727204140.52319: variable 'dhcp_interface2' from source: play vars 16142 1727204140.52403: variable 'dhcp_interface2' from source: play vars 16142 1727204140.52416: variable 'controller_profile' from source: play vars 16142 1727204140.52500: variable 'controller_profile' from source: play vars 16142 1727204140.52586: variable '__network_service_name_default_initscripts' from source: role '' defaults 16142 1727204140.52654: variable '__network_service_name_default_initscripts' from source: role '' defaults 16142 1727204140.52667: variable '__network_packages_default_initscripts' from source: role '' defaults 16142 1727204140.52741: variable '__network_packages_default_initscripts' from source: role '' defaults 16142 1727204140.52985: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 16142 1727204140.53547: variable 'network_connections' from source: task vars 16142 1727204140.53562: variable 'controller_profile' from source: play vars 16142 1727204140.53625: variable 'controller_profile' from source: play vars 16142 1727204140.53639: variable 'controller_device' from source: play vars 16142 1727204140.53709: variable 'controller_device' from source: play vars 16142 1727204140.53724: variable 'port1_profile' from source: play vars 16142 1727204140.53795: variable 'port1_profile' from source: play vars 16142 1727204140.53806: variable 'dhcp_interface1' from source: play vars 16142 1727204140.53866: variable 'dhcp_interface1' from source: play vars 16142 1727204140.53887: variable 'controller_profile' from source: play vars 16142 1727204140.53949: variable 'controller_profile' from source: play vars 16142 1727204140.53962: variable 'port2_profile' from source: play vars 16142 1727204140.54039: variable 'port2_profile' from source: play vars 16142 1727204140.54051: variable 'dhcp_interface2' from source: play vars 16142 1727204140.54122: variable 'dhcp_interface2' from source: play vars 16142 1727204140.54133: variable 'controller_profile' from source: play vars 16142 1727204140.54200: variable 
'controller_profile' from source: play vars 16142 1727204140.54218: variable 'ansible_distribution' from source: facts 16142 1727204140.54226: variable '__network_rh_distros' from source: role '' defaults 16142 1727204140.54237: variable 'ansible_distribution_major_version' from source: facts 16142 1727204140.54269: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 16142 1727204140.54471: variable 'ansible_distribution' from source: facts 16142 1727204140.54480: variable '__network_rh_distros' from source: role '' defaults 16142 1727204140.54490: variable 'ansible_distribution_major_version' from source: facts 16142 1727204140.54508: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 16142 1727204140.54733: variable 'ansible_distribution' from source: facts 16142 1727204140.54755: variable '__network_rh_distros' from source: role '' defaults 16142 1727204140.54769: variable 'ansible_distribution_major_version' from source: facts 16142 1727204140.54810: variable 'network_provider' from source: set_fact 16142 1727204140.54842: variable 'omit' from source: magic vars 16142 1727204140.54886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204140.54919: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204140.54947: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204140.54981: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204140.54997: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204140.55032: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204140.55044: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204140.55052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204140.55168: Set connection var ansible_timeout to 10 16142 1727204140.55181: Set connection var ansible_connection to ssh 16142 1727204140.55198: Set connection var ansible_shell_type to sh 16142 1727204140.55208: Set connection var ansible_shell_executable to /bin/sh 16142 1727204140.55219: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204140.55230: Set connection var ansible_pipelining to False 16142 1727204140.55261: variable 'ansible_shell_executable' from source: unknown 16142 1727204140.55273: variable 'ansible_connection' from source: unknown 16142 1727204140.55280: variable 'ansible_module_compression' from source: unknown 16142 1727204140.55291: variable 'ansible_shell_type' from source: unknown 16142 1727204140.55304: variable 'ansible_shell_executable' from source: unknown 16142 1727204140.55311: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204140.55324: variable 'ansible_pipelining' from source: unknown 16142 1727204140.55333: variable 'ansible_timeout' from source: unknown 16142 1727204140.55352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204140.55497: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204140.55534: variable 'omit' from source: magic vars 16142 1727204140.55546: starting attempt loop 16142 1727204140.55553: running the handler 16142 1727204140.55668: variable 'ansible_facts' from source: unknown 16142 1727204140.56691: _low_level_execute_command(): starting 16142 1727204140.56707: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204140.57756: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204140.57778: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204140.57794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204140.57825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204140.57880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204140.57892: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204140.57914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204140.57941: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204140.57952: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204140.57962: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204140.57977: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204140.57992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204140.58008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204140.58025: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204140.58049: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204140.58073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204140.58182: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204140.58202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204140.58217: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204140.58383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204140.59983: stdout chunk (state=3): >>>/root <<< 16142 1727204140.60098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204140.60188: stderr chunk (state=3): >>><<< 16142 1727204140.60198: stdout chunk (state=3): >>><<< 16142 1727204140.60311: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204140.60315: _low_level_execute_command(): starting 16142 1727204140.60317: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204140.602187-19026-121802702442982 `" && echo ansible-tmp-1727204140.602187-19026-121802702442982="` echo /root/.ansible/tmp/ansible-tmp-1727204140.602187-19026-121802702442982 `" ) && sleep 0' 16142 1727204140.61427: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204140.61444: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204140.61460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204140.61481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204140.61523: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204140.61539: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204140.61555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204140.61575: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204140.61587: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204140.61600: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204140.61613: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204140.61627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204140.61643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204140.61655: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204140.61668: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204140.61683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204140.61757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204140.61988: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204140.62009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204140.62601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204140.64411: stdout chunk (state=3): >>>ansible-tmp-1727204140.602187-19026-121802702442982=/root/.ansible/tmp/ansible-tmp-1727204140.602187-19026-121802702442982 <<< 16142 1727204140.64590: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 16142 1727204140.64673: stderr chunk (state=3): >>><<< 16142 1727204140.64676: stdout chunk (state=3): >>><<< 16142 1727204140.64701: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204140.602187-19026-121802702442982=/root/.ansible/tmp/ansible-tmp-1727204140.602187-19026-121802702442982 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204140.64736: variable 'ansible_module_compression' from source: unknown 16142 1727204140.64799: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 16142 1727204140.64861: variable 'ansible_facts' from source: unknown 16142 1727204140.65060: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204140.602187-19026-121802702442982/AnsiballZ_systemd.py 16142 1727204140.65453: Sending initial data 16142 1727204140.65459: Sent initial data (155 bytes) 16142 1727204140.67170: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204140.67178: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204140.67194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204140.67209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204140.67279: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204140.67286: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204140.67301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204140.67314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204140.67323: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204140.67333: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204140.67341: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204140.67351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204140.67363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204140.67372: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204140.67379: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204140.67388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204140.67524: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204140.67571: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204140.67584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204140.67736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204140.69456: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204140.69492: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204140.69531: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpz7tbezp1 /root/.ansible/tmp/ansible-tmp-1727204140.602187-19026-121802702442982/AnsiballZ_systemd.py <<< 16142 1727204140.69627: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204140.72585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204140.72725: stderr chunk (state=3): >>><<< 16142 1727204140.72730: stdout chunk (state=3): >>><<< 16142 1727204140.72753: done transferring module to remote 16142 1727204140.72762: _low_level_execute_command(): starting 16142 1727204140.72768: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204140.602187-19026-121802702442982/ /root/.ansible/tmp/ansible-tmp-1727204140.602187-19026-121802702442982/AnsiballZ_systemd.py && sleep 0' 16142 1727204140.73240: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204140.73251: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204140.73257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204140.73271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204140.73329: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204140.73351: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204140.73384: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204140.73395: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204140.73398: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204140.73406: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204140.73415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204140.73465: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204140.73479: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204140.73490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204140.73541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204140.75488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204140.75492: stdout chunk (state=3): >>><<< 16142 1727204140.75494: stderr chunk (state=3): >>><<< 16142 1727204140.75528: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204140.75590: _low_level_execute_command(): starting 16142 1727204140.75593: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204140.602187-19026-121802702442982/AnsiballZ_systemd.py && sleep 0' 16142 1727204140.76402: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204140.76432: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204140.76479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204140.76506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204140.76539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204140.76551: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204140.76624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204141.02133: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 16142 1727204141.02176: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "6836224", "MemoryAvailable": "infinity", "CPUUsageNSec": "1002523000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": 
"infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdo<<< 16142 1727204141.02180: stdout chunk (state=3): >>>gSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target 
network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 16142 1727204141.03786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204141.03790: stdout chunk (state=3): >>><<< 16142 1727204141.03795: stderr chunk (state=3): >>><<< 16142 1727204141.03826: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "6836224", "MemoryAvailable": "infinity", "CPUUsageNSec": "1002523000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": 
"infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", 
"ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
16142 1727204141.04009: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204140.602187-19026-121802702442982/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204141.04028: _low_level_execute_command(): starting 16142 1727204141.04033: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204140.602187-19026-121802702442982/ > /dev/null 2>&1 && sleep 0' 16142 1727204141.05541: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204141.05555: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204141.05572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204141.05586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204141.05628: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204141.05638: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204141.05645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204141.05670: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204141.05677: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204141.05683: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204141.05690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204141.05698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204141.05708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204141.05715: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204141.05721: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204141.05729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204141.05803: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204141.05821: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204141.05832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204141.05901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204141.07785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204141.07789: stdout chunk (state=3): >>><<< 16142 1727204141.07797: stderr chunk (state=3): >>><<< 16142 1727204141.07812: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204141.07819: handler run complete 16142 1727204141.07883: attempt loop complete, returning result 16142 1727204141.07887: _execute() done 16142 1727204141.07889: dumping result to json 16142 1727204141.07906: done dumping result, returning 16142 1727204141.07916: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-fddd-f6c7-0000000000db] 16142 1727204141.07921: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000db 16142 1727204141.08165: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000db 16142 1727204141.08169: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204141.08232: no more pending results, returning what we have 16142 1727204141.08236: results queue empty 16142 1727204141.08237: checking for any_errors_fatal 16142 1727204141.08243: done checking for any_errors_fatal 16142 1727204141.08244: checking for max_fail_percentage 16142 1727204141.08246: done checking for max_fail_percentage 16142 1727204141.08247: checking to see if all hosts have failed and the running result is not ok 16142 1727204141.08248: done checking to see if all hosts have failed 16142 1727204141.08249: getting the remaining hosts for this loop 16142 1727204141.08251: done getting the remaining hosts for this loop 16142 1727204141.08255: getting the next task for host managed-node2 16142 1727204141.08262: done getting next task for host managed-node2 16142 1727204141.08274: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 16142 1727204141.08277: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204141.08292: getting variables 16142 1727204141.08294: in VariableManager get_vars() 16142 1727204141.08354: Calling all_inventory to load vars for managed-node2 16142 1727204141.08357: Calling groups_inventory to load vars for managed-node2 16142 1727204141.08360: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204141.08374: Calling all_plugins_play to load vars for managed-node2 16142 1727204141.08379: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204141.08382: Calling groups_plugins_play to load vars for managed-node2 16142 1727204141.11477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204141.16939: done with get_vars() 16142 1727204141.17089: done getting variables 16142 1727204141.17155: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:55:41 -0400 (0:00:00.826) 0:00:40.350 ***** 16142 1727204141.17412: entering _queue_task() for managed-node2/service 16142 1727204141.18128: worker is 1 (out of 1 available) 16142 1727204141.18143: exiting _queue_task() for managed-node2/service 16142 1727204141.18173: done queuing things up, now waiting for results queue to drain 16142 1727204141.18175: waiting for pending results... 16142 1727204141.19111: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 16142 1727204141.19446: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000dc 16142 1727204141.19580: variable 'ansible_search_path' from source: unknown 16142 1727204141.19585: variable 'ansible_search_path' from source: unknown 16142 1727204141.19626: calling self._execute() 16142 1727204141.19842: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204141.19849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204141.19862: variable 'omit' from source: magic vars 16142 1727204141.20712: variable 'ansible_distribution_major_version' from source: facts 16142 1727204141.20725: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204141.20969: variable 'network_provider' from source: set_fact 16142 1727204141.21101: Evaluated conditional (network_provider == "nm"): True 16142 1727204141.21341: variable '__network_wpa_supplicant_required' from source: role '' defaults 16142 1727204141.21548: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 16142 1727204141.21846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204141.24883: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204141.24956: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204141.24994: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204141.25040: Loading FilterModule 
'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204141.25066: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204141.25159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204141.25190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204141.25220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204141.25271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204141.25284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204141.25342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204141.25366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204141.25392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204141.25428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204141.25454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204141.25496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204141.25517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204141.25545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204141.25591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204141.25604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 16142 1727204141.25773: variable 'network_connections' from source: task vars 16142 1727204141.25789: variable 'controller_profile' from source: play vars 16142 1727204141.25855: variable 'controller_profile' from source: play vars 16142 1727204141.25868: variable 'controller_device' from source: play vars 16142 1727204141.25934: variable 'controller_device' from source: play vars 16142 1727204141.25943: variable 'port1_profile' from source: play vars 16142 1727204141.26011: variable 'port1_profile' from source: play vars 16142 1727204141.26019: variable 'dhcp_interface1' from source: play vars 16142 1727204141.26073: variable 'dhcp_interface1' from source: play vars 16142 1727204141.26083: variable 'controller_profile' from source: play vars 16142 1727204141.26143: variable 'controller_profile' from source: play vars 16142 1727204141.26150: variable 'port2_profile' from source: play vars 16142 1727204141.26217: variable 'port2_profile' from source: play vars 16142 1727204141.26223: variable 'dhcp_interface2' from source: play vars 16142 1727204141.26280: variable 'dhcp_interface2' from source: play vars 16142 1727204141.26286: variable 'controller_profile' from source: play vars 16142 1727204141.26379: variable 'controller_profile' from source: play vars 16142 1727204141.26458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204141.26655: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204141.26696: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204141.26726: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204141.26767: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204141.26816: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204141.26840: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204141.26879: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204141.26903: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204141.26959: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204141.27234: variable 'network_connections' from source: task vars 16142 1727204141.27240: variable 'controller_profile' from source: play vars 16142 1727204141.27313: variable 'controller_profile' from source: play vars 16142 1727204141.27321: variable 'controller_device' from source: play vars 16142 1727204141.27380: variable 'controller_device' from source: play vars 16142 1727204141.27393: variable 'port1_profile' from source: play vars 16142 1727204141.27454: variable 'port1_profile' from source: play vars 16142 1727204141.27461: variable 'dhcp_interface1' from source: play vars 16142 1727204141.27539: variable 
'dhcp_interface1' from source: play vars 16142 1727204141.27543: variable 'controller_profile' from source: play vars 16142 1727204141.27601: variable 'controller_profile' from source: play vars 16142 1727204141.27615: variable 'port2_profile' from source: play vars 16142 1727204141.27687: variable 'port2_profile' from source: play vars 16142 1727204141.27694: variable 'dhcp_interface2' from source: play vars 16142 1727204141.27763: variable 'dhcp_interface2' from source: play vars 16142 1727204141.27769: variable 'controller_profile' from source: play vars 16142 1727204141.27820: variable 'controller_profile' from source: play vars 16142 1727204141.27879: Evaluated conditional (__network_wpa_supplicant_required): False 16142 1727204141.27882: when evaluation is False, skipping this task 16142 1727204141.27885: _execute() done 16142 1727204141.27888: dumping result to json 16142 1727204141.27890: done dumping result, returning 16142 1727204141.27900: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-fddd-f6c7-0000000000dc] 16142 1727204141.27906: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000dc 16142 1727204141.28008: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000dc 16142 1727204141.28012: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 16142 1727204141.28074: no more pending results, returning what we have 16142 1727204141.28078: results queue empty 16142 1727204141.28079: checking for any_errors_fatal 16142 1727204141.28095: done checking for any_errors_fatal 16142 1727204141.28096: checking for max_fail_percentage 16142 1727204141.28098: done checking for max_fail_percentage 16142 1727204141.28099: checking to see if all hosts have failed and the running result is not ok 16142 1727204141.28100: done checking to see if all hosts have failed 16142 1727204141.28101: getting the remaining hosts for this loop 16142 1727204141.28103: done getting the remaining hosts for this loop 16142 1727204141.28107: getting the next task for host managed-node2 16142 1727204141.28114: done getting next task for host managed-node2 16142 1727204141.28119: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 16142 1727204141.28122: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204141.28145: getting variables 16142 1727204141.28148: in VariableManager get_vars() 16142 1727204141.28212: Calling all_inventory to load vars for managed-node2 16142 1727204141.28215: Calling groups_inventory to load vars for managed-node2 16142 1727204141.28217: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204141.28229: Calling all_plugins_play to load vars for managed-node2 16142 1727204141.28232: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204141.28235: Calling groups_plugins_play to load vars for managed-node2 16142 1727204141.32363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204141.37520: done with get_vars() 16142 1727204141.37558: done getting variables 16142 1727204141.37741: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:55:41 -0400 (0:00:00.203) 0:00:40.554 ***** 16142 1727204141.37780: entering _queue_task() for managed-node2/service 16142 1727204141.38574: worker is 1 (out of 1 available) 16142 1727204141.38588: exiting _queue_task() for managed-node2/service 16142 1727204141.38601: done queuing things up, now waiting for results queue to drain 16142 1727204141.38602: waiting for pending results... 16142 1727204141.38983: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 16142 1727204141.39119: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000dd 16142 1727204141.39133: variable 'ansible_search_path' from source: unknown 16142 1727204141.39140: variable 'ansible_search_path' from source: unknown 16142 1727204141.39183: calling self._execute() 16142 1727204141.39298: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204141.39309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204141.39323: variable 'omit' from source: magic vars 16142 1727204141.39881: variable 'ansible_distribution_major_version' from source: facts 16142 1727204141.39895: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204141.40081: variable 'network_provider' from source: set_fact 16142 1727204141.40086: Evaluated conditional (network_provider == "initscripts"): False 16142 1727204141.40088: when evaluation is False, skipping this task 16142 1727204141.40094: _execute() done 16142 1727204141.40097: dumping result to json 16142 1727204141.40099: done dumping result, returning 16142 1727204141.40126: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-fddd-f6c7-0000000000dd] 16142 1727204141.40133: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000dd skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204141.40311: no more pending results, returning what we have 16142 1727204141.40316: results queue empty 16142 1727204141.40317: checking for 
any_errors_fatal 16142 1727204141.40328: done checking for any_errors_fatal 16142 1727204141.40329: checking for max_fail_percentage 16142 1727204141.40333: done checking for max_fail_percentage 16142 1727204141.40334: checking to see if all hosts have failed and the running result is not ok 16142 1727204141.40335: done checking to see if all hosts have failed 16142 1727204141.40336: getting the remaining hosts for this loop 16142 1727204141.40337: done getting the remaining hosts for this loop 16142 1727204141.40342: getting the next task for host managed-node2 16142 1727204141.40350: done getting next task for host managed-node2 16142 1727204141.40354: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 16142 1727204141.40358: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204141.40377: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000dd 16142 1727204141.40396: getting variables 16142 1727204141.40398: in VariableManager get_vars() 16142 1727204141.40462: Calling all_inventory to load vars for managed-node2 16142 1727204141.40469: Calling groups_inventory to load vars for managed-node2 16142 1727204141.40472: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204141.40479: WORKER PROCESS EXITING 16142 1727204141.40497: Calling all_plugins_play to load vars for managed-node2 16142 1727204141.40501: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204141.40504: Calling groups_plugins_play to load vars for managed-node2 16142 1727204141.42370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204141.45672: done with get_vars() 16142 1727204141.45819: done getting variables 16142 1727204141.45885: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:55:41 -0400 (0:00:00.081) 0:00:40.636 ***** 16142 1727204141.46011: entering _queue_task() for managed-node2/copy 16142 1727204141.46760: worker is 1 (out of 1 available) 16142 1727204141.46777: exiting _queue_task() for managed-node2/copy 16142 1727204141.46789: done queuing things up, now waiting for results queue to drain 16142 1727204141.46902: waiting for pending results... 
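The skips recorded above for "Enable and start wpa_supplicant" and "Enable network service", and the skip about to be logged for the initscripts file dependency below, all follow the same gating pattern: the role evaluates network_provider (set to "nm" on this host) plus, for wpa_supplicant, __network_wpa_supplicant_required, and skips any task whose condition is False. A rough sketch of that gating, with the conditionals taken from this log and the service names and module bodies assumed purely for illustration:

    # Sketch only; conditions copied from the "Evaluated conditional"/"false_condition"
    # lines in this log, service names and module bodies are assumptions.
    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant
        state: started
        enabled: true
      when:
        - network_provider == "nm"
        - __network_wpa_supplicant_required    # False here: no wireless/802.1x connections defined

    - name: Enable network service
      ansible.builtin.service:
        name: network
        state: started
        enabled: true
      when: network_provider == "initscripts"  # False here, so the task is skipped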
16142 1727204141.47632: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 16142 1727204141.48010: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000de 16142 1727204141.48172: variable 'ansible_search_path' from source: unknown 16142 1727204141.48270: variable 'ansible_search_path' from source: unknown 16142 1727204141.48316: calling self._execute() 16142 1727204141.48523: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204141.48536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204141.48552: variable 'omit' from source: magic vars 16142 1727204141.49419: variable 'ansible_distribution_major_version' from source: facts 16142 1727204141.49441: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204141.49696: variable 'network_provider' from source: set_fact 16142 1727204141.49709: Evaluated conditional (network_provider == "initscripts"): False 16142 1727204141.49787: when evaluation is False, skipping this task 16142 1727204141.49796: _execute() done 16142 1727204141.49804: dumping result to json 16142 1727204141.49812: done dumping result, returning 16142 1727204141.49825: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-fddd-f6c7-0000000000de] 16142 1727204141.49836: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000de skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 16142 1727204141.50029: no more pending results, returning what we have 16142 1727204141.50034: results queue empty 16142 1727204141.50037: checking for any_errors_fatal 16142 1727204141.50041: done checking for any_errors_fatal 16142 1727204141.50042: checking for max_fail_percentage 16142 1727204141.50045: done checking for max_fail_percentage 16142 1727204141.50046: checking to see if all hosts have failed and the running result is not ok 16142 1727204141.50046: done checking to see if all hosts have failed 16142 1727204141.50047: getting the remaining hosts for this loop 16142 1727204141.50049: done getting the remaining hosts for this loop 16142 1727204141.50052: getting the next task for host managed-node2 16142 1727204141.50059: done getting next task for host managed-node2 16142 1727204141.50063: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 16142 1727204141.50068: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204141.50096: getting variables 16142 1727204141.50098: in VariableManager get_vars() 16142 1727204141.50158: Calling all_inventory to load vars for managed-node2 16142 1727204141.50162: Calling groups_inventory to load vars for managed-node2 16142 1727204141.50167: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204141.50180: Calling all_plugins_play to load vars for managed-node2 16142 1727204141.50182: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204141.50186: Calling groups_plugins_play to load vars for managed-node2 16142 1727204141.51204: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000de 16142 1727204141.51208: WORKER PROCESS EXITING 16142 1727204141.53055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204141.55415: done with get_vars() 16142 1727204141.55459: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:55:41 -0400 (0:00:00.096) 0:00:40.733 ***** 16142 1727204141.55650: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 16142 1727204141.56291: worker is 1 (out of 1 available) 16142 1727204141.56305: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 16142 1727204141.56319: done queuing things up, now waiting for results queue to drain 16142 1727204141.56320: waiting for pending results... 16142 1727204141.57887: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 16142 1727204141.58068: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000df 16142 1727204141.58134: variable 'ansible_search_path' from source: unknown 16142 1727204141.58144: variable 'ansible_search_path' from source: unknown 16142 1727204141.58557: calling self._execute() 16142 1727204141.58675: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204141.58686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204141.58702: variable 'omit' from source: magic vars 16142 1727204141.59116: variable 'ansible_distribution_major_version' from source: facts 16142 1727204141.59137: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204141.59149: variable 'omit' from source: magic vars 16142 1727204141.59225: variable 'omit' from source: magic vars 16142 1727204141.59397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204141.61970: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204141.62071: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204141.62178: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204141.62255: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204141.62291: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204141.62460: variable 'network_provider' from source: set_fact 16142 1727204141.62698: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204141.62756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204141.62795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204141.62851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204141.62874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204141.62980: variable 'omit' from source: magic vars 16142 1727204141.63200: variable 'omit' from source: magic vars 16142 1727204141.63545: variable 'network_connections' from source: task vars 16142 1727204141.63562: variable 'controller_profile' from source: play vars 16142 1727204141.63709: variable 'controller_profile' from source: play vars 16142 1727204141.63737: variable 'controller_device' from source: play vars 16142 1727204141.63811: variable 'controller_device' from source: play vars 16142 1727204141.63828: variable 'port1_profile' from source: play vars 16142 1727204141.63957: variable 'port1_profile' from source: play vars 16142 1727204141.63973: variable 'dhcp_interface1' from source: play vars 16142 1727204141.64149: variable 'dhcp_interface1' from source: play vars 16142 1727204141.64169: variable 'controller_profile' from source: play vars 16142 1727204141.64231: variable 'controller_profile' from source: play vars 16142 1727204141.64394: variable 'port2_profile' from source: play vars 16142 1727204141.64459: variable 'port2_profile' from source: play vars 16142 1727204141.64476: variable 'dhcp_interface2' from source: play vars 16142 1727204141.64547: variable 'dhcp_interface2' from source: play vars 16142 1727204141.64719: variable 'controller_profile' from source: play vars 16142 1727204141.64787: variable 'controller_profile' from source: play vars 16142 1727204141.65260: variable 'omit' from source: magic vars 16142 1727204141.65332: variable '__lsr_ansible_managed' from source: task vars 16142 1727204141.65453: variable '__lsr_ansible_managed' from source: task vars 16142 1727204141.65928: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 16142 1727204141.66192: Loaded config def from plugin (lookup/template) 16142 1727204141.66201: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 16142 1727204141.66230: File lookup term: get_ansible_managed.j2 16142 1727204141.66238: variable 'ansible_search_path' from source: unknown 16142 1727204141.66251: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 16142 1727204141.66271: search_path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 16142 1727204141.66292: variable 'ansible_search_path' from source: unknown 16142 1727204141.86474: variable 'ansible_managed' from source: unknown 16142 1727204141.86663: variable 'omit' from source: magic vars 16142 1727204141.86698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204141.86739: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204141.86766: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204141.86788: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204141.86802: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204141.86839: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204141.86850: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204141.86861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204141.86978: Set connection var ansible_timeout to 10 16142 1727204141.86986: Set connection var ansible_connection to ssh 16142 1727204141.86996: Set connection var ansible_shell_type to sh 16142 1727204141.87006: Set connection var ansible_shell_executable to /bin/sh 16142 1727204141.87016: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204141.87041: Set connection var ansible_pipelining to False 16142 1727204141.87081: variable 'ansible_shell_executable' from source: unknown 16142 1727204141.87089: variable 'ansible_connection' from source: unknown 16142 1727204141.87096: variable 'ansible_module_compression' from source: unknown 16142 1727204141.87103: variable 'ansible_shell_type' from source: unknown 16142 1727204141.87110: variable 'ansible_shell_executable' from source: unknown 16142 1727204141.87116: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204141.87124: variable 'ansible_pipelining' from source: unknown 16142 1727204141.87132: variable 'ansible_timeout' from source: unknown 16142 1727204141.87143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204141.87295: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204141.87310: variable 'omit' from source: magic vars 16142 1727204141.87323: starting attempt loop 16142 1727204141.87330: running the handler 16142 1727204141.87349: _low_level_execute_command(): starting 16142 1727204141.87359: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204141.88184: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204141.88199: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204141.88213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204141.88231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204141.88980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204141.88994: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204141.89011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204141.89037: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204141.89051: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204141.89061: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204141.89076: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204141.89090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204141.89106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204141.89117: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204141.89137: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204141.89153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204141.89231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204141.89377: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204141.89394: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204141.89484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204141.91191: stdout chunk (state=3): >>>/root <<< 16142 1727204141.91395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204141.91399: stdout chunk (state=3): >>><<< 16142 1727204141.91401: stderr chunk (state=3): >>><<< 16142 1727204141.91892: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204141.91896: _low_level_execute_command(): starting 16142 1727204141.91899: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204141.917932-19074-19864657385443 `" && echo ansible-tmp-1727204141.917932-19074-19864657385443="` echo /root/.ansible/tmp/ansible-tmp-1727204141.917932-19074-19864657385443 `" ) && sleep 0' 16142 1727204141.93058: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204141.93063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204141.93093: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204141.93096: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204141.93099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204141.93230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204141.93238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204141.93253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204141.95282: stdout chunk (state=3): >>>ansible-tmp-1727204141.917932-19074-19864657385443=/root/.ansible/tmp/ansible-tmp-1727204141.917932-19074-19864657385443 <<< 16142 1727204141.95390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204141.95479: stderr chunk (state=3): >>><<< 16142 1727204141.95483: stdout chunk (state=3): >>><<< 16142 1727204141.95775: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204141.917932-19074-19864657385443=/root/.ansible/tmp/ansible-tmp-1727204141.917932-19074-19864657385443 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204141.95782: variable 'ansible_module_compression' from source: unknown 16142 1727204141.95785: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 16142 1727204141.95787: variable 'ansible_facts' from source: unknown 16142 1727204141.95789: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204141.917932-19074-19864657385443/AnsiballZ_network_connections.py 16142 1727204141.96394: Sending initial data 16142 1727204141.96398: Sent initial data (166 bytes) 16142 1727204141.99014: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204141.99019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204141.99183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204141.99188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204141.99204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204141.99207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204141.99220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204141.99307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204141.99384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204141.99397: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204141.99472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204142.01363: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204142.01438: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204142.01461: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpug1ljks6 /root/.ansible/tmp/ansible-tmp-1727204141.917932-19074-19864657385443/AnsiballZ_network_connections.py <<< 16142 1727204142.01597: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204142.03685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204142.03773: stderr chunk (state=3): >>><<< 16142 1727204142.03777: stdout chunk (state=3): >>><<< 16142 1727204142.03799: done transferring module to remote 16142 1727204142.03810: _low_level_execute_command(): starting 16142 1727204142.03813: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204141.917932-19074-19864657385443/ /root/.ansible/tmp/ansible-tmp-1727204141.917932-19074-19864657385443/AnsiballZ_network_connections.py && sleep 0' 16142 1727204142.05441: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204142.05446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204142.05604: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204142.05608: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204142.05624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204142.05630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204142.05786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204142.06078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204142.07918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204142.07922: stderr chunk (state=3): >>><<< 16142 1727204142.07924: stdout chunk (state=3): >>><<< 16142 1727204142.07943: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204142.07946: _low_level_execute_command(): starting 16142 1727204142.07949: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204141.917932-19074-19864657385443/AnsiballZ_network_connections.py && sleep 0' 16142 1727204142.09407: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204142.09672: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204142.09684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204142.09699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204142.09744: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204142.09751: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204142.09765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204142.09785: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204142.09792: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204142.09799: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204142.09807: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204142.09817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204142.09829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204142.09839: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204142.09842: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204142.09852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204142.09928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204142.09944: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204142.09951: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204142.10061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204142.56888: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, da90ddbf-a91a-40cb-8cf8-f4fc8a58a465\n[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 72f186f1-b611-4f2e-9d00-de0c3bf7aa23\n[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, 9eebdc14-bdb2-41b3-94db-1a5b2e988b68\n[010] #0, 
state:up persistent_state:present, 'bond0': up connection bond0, da90ddbf-a91a-40cb-8cf8-f4fc8a58a465 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 72f186f1-b611-4f2e-9d00-de0c3bf7aa23 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 9eebdc14-bdb2-41b3-94db-1a5b2e988b68 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 16142 1727204142.59471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 16142 1727204142.59475: stdout chunk (state=3): >>><<< 16142 1727204142.59478: stderr chunk (state=3): >>><<< 16142 1727204142.59480: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, da90ddbf-a91a-40cb-8cf8-f4fc8a58a465\n[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 72f186f1-b611-4f2e-9d00-de0c3bf7aa23\n[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, 9eebdc14-bdb2-41b3-94db-1a5b2e988b68\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, da90ddbf-a91a-40cb-8cf8-f4fc8a58a465 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 72f186f1-b611-4f2e-9d00-de0c3bf7aa23 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 9eebdc14-bdb2-41b3-94db-1a5b2e988b68 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", 
"controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 16142 1727204142.59483: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204141.917932-19074-19864657385443/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204142.59490: _low_level_execute_command(): starting 16142 1727204142.59497: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204141.917932-19074-19864657385443/ > /dev/null 2>&1 && sleep 0' 16142 1727204142.61043: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204142.61095: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204142.61105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204142.61119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204142.61158: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 
16142 1727204142.61199: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204142.61208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204142.61223: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204142.61230: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204142.61239: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204142.61246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204142.61309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204142.61321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204142.61329: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204142.61339: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204142.61346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204142.61492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204142.61506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204142.61513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204142.61744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204142.63581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204142.63599: stderr chunk (state=3): >>><<< 16142 1727204142.63602: stdout chunk (state=3): >>><<< 16142 1727204142.63621: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204142.63629: handler run complete 16142 1727204142.63669: attempt loop complete, returning result 16142 1727204142.63673: _execute() done 16142 1727204142.63675: dumping result to json 16142 1727204142.63682: done dumping result, returning 16142 1727204142.63692: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-fddd-f6c7-0000000000df] 16142 1727204142.63695: sending task result for task 
0affcd87-79f5-fddd-f6c7-0000000000df 16142 1727204142.63827: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000df 16142 1727204142.63830: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, da90ddbf-a91a-40cb-8cf8-f4fc8a58a465 [008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 72f186f1-b611-4f2e-9d00-de0c3bf7aa23 [009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, 9eebdc14-bdb2-41b3-94db-1a5b2e988b68 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, da90ddbf-a91a-40cb-8cf8-f4fc8a58a465 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 72f186f1-b611-4f2e-9d00-de0c3bf7aa23 (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 9eebdc14-bdb2-41b3-94db-1a5b2e988b68 (not-active) 16142 1727204142.63962: no more pending results, returning what we have 16142 1727204142.63967: results queue empty 16142 1727204142.63968: checking for any_errors_fatal 16142 1727204142.63972: done checking for any_errors_fatal 16142 1727204142.63973: checking for max_fail_percentage 16142 1727204142.63975: done checking for max_fail_percentage 16142 1727204142.63975: checking to see if all hosts have failed and the running result is not ok 16142 1727204142.63976: done checking to see if all hosts have failed 16142 1727204142.63977: getting the remaining hosts for this loop 16142 1727204142.63978: done getting the remaining hosts for this loop 16142 1727204142.63981: getting the next task for host managed-node2 16142 1727204142.63986: done getting next task for host managed-node2 16142 1727204142.63990: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 16142 1727204142.63992: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204142.64009: getting variables 16142 1727204142.64010: in VariableManager get_vars() 16142 1727204142.64060: Calling all_inventory to load vars for managed-node2 16142 1727204142.64062: Calling groups_inventory to load vars for managed-node2 16142 1727204142.64069: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204142.64078: Calling all_plugins_play to load vars for managed-node2 16142 1727204142.64080: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204142.64083: Calling groups_plugins_play to load vars for managed-node2 16142 1727204142.66703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204142.82529: done with get_vars() 16142 1727204142.82562: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:55:42 -0400 (0:00:01.271) 0:00:42.004 ***** 16142 1727204142.82769: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 16142 1727204142.83607: worker is 1 (out of 1 available) 16142 1727204142.83622: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 16142 1727204142.83634: done queuing things up, now waiting for results queue to drain 16142 1727204142.83638: waiting for pending results... 16142 1727204142.84914: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 16142 1727204142.85438: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000e0 16142 1727204142.85474: variable 'ansible_search_path' from source: unknown 16142 1727204142.85478: variable 'ansible_search_path' from source: unknown 16142 1727204142.85628: calling self._execute() 16142 1727204142.86080: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204142.86100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204142.86111: variable 'omit' from source: magic vars 16142 1727204142.87471: variable 'ansible_distribution_major_version' from source: facts 16142 1727204142.87652: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204142.87897: variable 'network_state' from source: role '' defaults 16142 1727204142.87910: Evaluated conditional (network_state != {}): False 16142 1727204142.87913: when evaluation is False, skipping this task 16142 1727204142.87916: _execute() done 16142 1727204142.87919: dumping result to json 16142 1727204142.87923: done dumping result, returning 16142 1727204142.87933: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-fddd-f6c7-0000000000e0] 16142 1727204142.87940: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000e0 16142 1727204142.88058: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000e0 16142 1727204142.88062: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204142.88126: no more pending results, returning what we have 16142 1727204142.88131: results queue empty 16142 1727204142.88132: checking for any_errors_fatal 16142 1727204142.88149: done checking for any_errors_fatal 16142 1727204142.88150: checking for max_fail_percentage 16142 
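The module_args recorded for the "Configure networking connection profiles" task above describe an active-backup bond (nm-bond, miimon 110) with two ethernet ports attached to it. Reconstructed from those logged arguments, the role input driving this run would look roughly like the following network_connections variable; this is an illustrative sketch derived from the log, not a copy of the actual test playbook.

# Sketch of the role input implied by the logged module_args
# (reconstructed for illustration; the real test playbook may differ).
network_connections:
  - name: bond0
    state: up
    type: bond
    interface_name: nm-bond
    bond:
      mode: active-backup
      miimon: 110
    ip:
      route_metric4: 65535
  - name: bond0.0
    state: up
    type: ethernet
    interface_name: test1
    controller: bond0
  - name: bond0.1
    state: up
    type: ethernet
    interface_name: test2
    controller: bond0

The stderr lines in the result ([007]-[012]) show the "nm" provider (NetworkManager) adding the bond0 profile and updating and activating the two port profiles, which is why the task reports changed: true.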
1727204142.88152: done checking for max_fail_percentage 16142 1727204142.88153: checking to see if all hosts have failed and the running result is not ok 16142 1727204142.88154: done checking to see if all hosts have failed 16142 1727204142.88155: getting the remaining hosts for this loop 16142 1727204142.88157: done getting the remaining hosts for this loop 16142 1727204142.88161: getting the next task for host managed-node2 16142 1727204142.88177: done getting next task for host managed-node2 16142 1727204142.88181: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 16142 1727204142.88184: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204142.88209: getting variables 16142 1727204142.88211: in VariableManager get_vars() 16142 1727204142.88270: Calling all_inventory to load vars for managed-node2 16142 1727204142.88273: Calling groups_inventory to load vars for managed-node2 16142 1727204142.88275: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204142.88288: Calling all_plugins_play to load vars for managed-node2 16142 1727204142.88291: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204142.88293: Calling groups_plugins_play to load vars for managed-node2 16142 1727204142.90570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204142.93322: done with get_vars() 16142 1727204142.93371: done getting variables 16142 1727204142.93461: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:55:42 -0400 (0:00:00.107) 0:00:42.111 ***** 16142 1727204142.93516: entering _queue_task() for managed-node2/debug 16142 1727204142.93897: worker is 1 (out of 1 available) 16142 1727204142.93912: exiting _queue_task() for managed-node2/debug 16142 1727204142.93925: done queuing things up, now waiting for results queue to drain 16142 1727204142.93926: waiting for pending results... 
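The task queued next only prints the stderr captured from the previous module run. Judging from the task name, the task path (roles/network/tasks/main.yml:177) and the variable shown in the output below, it is essentially a debug task over __network_connections_result.stderr_lines; the exact task body is an assumption sketched here for orientation, not the role source.

# Minimal sketch of a task that would produce the "Show stderr messages
# for the network_connections" output seen below (assumed form).
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines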
16142 1727204142.94259: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 16142 1727204142.94396: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000e1 16142 1727204142.94411: variable 'ansible_search_path' from source: unknown 16142 1727204142.94416: variable 'ansible_search_path' from source: unknown 16142 1727204142.94456: calling self._execute() 16142 1727204142.94561: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204142.94567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204142.94580: variable 'omit' from source: magic vars 16142 1727204142.95008: variable 'ansible_distribution_major_version' from source: facts 16142 1727204142.95022: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204142.95032: variable 'omit' from source: magic vars 16142 1727204142.95105: variable 'omit' from source: magic vars 16142 1727204142.95144: variable 'omit' from source: magic vars 16142 1727204142.95193: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204142.95230: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204142.95254: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204142.95278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204142.95289: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204142.95360: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204142.95366: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204142.95369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204142.95571: Set connection var ansible_timeout to 10 16142 1727204142.95574: Set connection var ansible_connection to ssh 16142 1727204142.95577: Set connection var ansible_shell_type to sh 16142 1727204142.95579: Set connection var ansible_shell_executable to /bin/sh 16142 1727204142.95581: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204142.95583: Set connection var ansible_pipelining to False 16142 1727204142.95586: variable 'ansible_shell_executable' from source: unknown 16142 1727204142.95588: variable 'ansible_connection' from source: unknown 16142 1727204142.95591: variable 'ansible_module_compression' from source: unknown 16142 1727204142.95593: variable 'ansible_shell_type' from source: unknown 16142 1727204142.95595: variable 'ansible_shell_executable' from source: unknown 16142 1727204142.95597: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204142.95599: variable 'ansible_pipelining' from source: unknown 16142 1727204142.95602: variable 'ansible_timeout' from source: unknown 16142 1727204142.95604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204142.95775: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 
1727204142.95780: variable 'omit' from source: magic vars 16142 1727204142.95782: starting attempt loop 16142 1727204142.95784: running the handler 16142 1727204142.95892: variable '__network_connections_result' from source: set_fact 16142 1727204142.95968: handler run complete 16142 1727204142.95987: attempt loop complete, returning result 16142 1727204142.95990: _execute() done 16142 1727204142.95993: dumping result to json 16142 1727204142.95995: done dumping result, returning 16142 1727204142.96003: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-fddd-f6c7-0000000000e1] 16142 1727204142.96016: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000e1 ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, da90ddbf-a91a-40cb-8cf8-f4fc8a58a465", "[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 72f186f1-b611-4f2e-9d00-de0c3bf7aa23", "[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, 9eebdc14-bdb2-41b3-94db-1a5b2e988b68", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, da90ddbf-a91a-40cb-8cf8-f4fc8a58a465 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 72f186f1-b611-4f2e-9d00-de0c3bf7aa23 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 9eebdc14-bdb2-41b3-94db-1a5b2e988b68 (not-active)" ] } 16142 1727204142.96193: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000e1 16142 1727204142.96196: WORKER PROCESS EXITING 16142 1727204142.96211: no more pending results, returning what we have 16142 1727204142.96215: results queue empty 16142 1727204142.96216: checking for any_errors_fatal 16142 1727204142.96222: done checking for any_errors_fatal 16142 1727204142.96223: checking for max_fail_percentage 16142 1727204142.96225: done checking for max_fail_percentage 16142 1727204142.96226: checking to see if all hosts have failed and the running result is not ok 16142 1727204142.96226: done checking to see if all hosts have failed 16142 1727204142.96227: getting the remaining hosts for this loop 16142 1727204142.96229: done getting the remaining hosts for this loop 16142 1727204142.96233: getting the next task for host managed-node2 16142 1727204142.96239: done getting next task for host managed-node2 16142 1727204142.96243: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 16142 1727204142.96247: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204142.96261: getting variables 16142 1727204142.96265: in VariableManager get_vars() 16142 1727204142.96328: Calling all_inventory to load vars for managed-node2 16142 1727204142.96331: Calling groups_inventory to load vars for managed-node2 16142 1727204142.96334: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204142.96345: Calling all_plugins_play to load vars for managed-node2 16142 1727204142.96348: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204142.96351: Calling groups_plugins_play to load vars for managed-node2 16142 1727204142.98352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204143.00669: done with get_vars() 16142 1727204143.00693: done getting variables 16142 1727204143.00747: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:55:43 -0400 (0:00:00.072) 0:00:42.184 ***** 16142 1727204143.00786: entering _queue_task() for managed-node2/debug 16142 1727204143.01112: worker is 1 (out of 1 available) 16142 1727204143.01126: exiting _queue_task() for managed-node2/debug 16142 1727204143.01141: done queuing things up, now waiting for results queue to drain 16142 1727204143.01142: waiting for pending results... 16142 1727204143.01541: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 16142 1727204143.01999: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000e2 16142 1727204143.02004: variable 'ansible_search_path' from source: unknown 16142 1727204143.02007: variable 'ansible_search_path' from source: unknown 16142 1727204143.02010: calling self._execute() 16142 1727204143.02014: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204143.02017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204143.02020: variable 'omit' from source: magic vars 16142 1727204143.02306: variable 'ansible_distribution_major_version' from source: facts 16142 1727204143.02310: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204143.02312: variable 'omit' from source: magic vars 16142 1727204143.02670: variable 'omit' from source: magic vars 16142 1727204143.02675: variable 'omit' from source: magic vars 16142 1727204143.02683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204143.02687: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204143.02689: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204143.02692: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204143.02695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204143.02697: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204143.02700: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204143.02702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204143.02705: Set connection var ansible_timeout to 10 16142 1727204143.02707: Set connection var ansible_connection to ssh 16142 1727204143.02709: Set connection var ansible_shell_type to sh 16142 1727204143.02711: Set connection var ansible_shell_executable to /bin/sh 16142 1727204143.02713: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204143.02720: Set connection var ansible_pipelining to False 16142 1727204143.02742: variable 'ansible_shell_executable' from source: unknown 16142 1727204143.02745: variable 'ansible_connection' from source: unknown 16142 1727204143.02748: variable 'ansible_module_compression' from source: unknown 16142 1727204143.02750: variable 'ansible_shell_type' from source: unknown 16142 1727204143.02752: variable 'ansible_shell_executable' from source: unknown 16142 1727204143.02754: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204143.02756: variable 'ansible_pipelining' from source: unknown 16142 1727204143.02758: variable 'ansible_timeout' from source: unknown 16142 1727204143.02761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204143.03180: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204143.03184: variable 'omit' from source: magic vars 16142 1727204143.03186: starting attempt loop 16142 1727204143.03189: running the handler 16142 1727204143.03191: variable '__network_connections_result' from source: set_fact 16142 1727204143.03193: variable '__network_connections_result' from source: set_fact 16142 1727204143.03257: handler run complete 16142 1727204143.03288: attempt loop complete, returning result 16142 1727204143.03291: _execute() done 16142 1727204143.03294: dumping result to json 16142 1727204143.03299: done dumping result, returning 16142 1727204143.03309: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-fddd-f6c7-0000000000e2] 16142 1727204143.03315: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000e2 16142 1727204143.03430: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000e2 16142 1727204143.03433: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 
da90ddbf-a91a-40cb-8cf8-f4fc8a58a465\n[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 72f186f1-b611-4f2e-9d00-de0c3bf7aa23\n[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, 9eebdc14-bdb2-41b3-94db-1a5b2e988b68\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, da90ddbf-a91a-40cb-8cf8-f4fc8a58a465 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 72f186f1-b611-4f2e-9d00-de0c3bf7aa23 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 9eebdc14-bdb2-41b3-94db-1a5b2e988b68 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, da90ddbf-a91a-40cb-8cf8-f4fc8a58a465", "[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 72f186f1-b611-4f2e-9d00-de0c3bf7aa23", "[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, 9eebdc14-bdb2-41b3-94db-1a5b2e988b68", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, da90ddbf-a91a-40cb-8cf8-f4fc8a58a465 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 72f186f1-b611-4f2e-9d00-de0c3bf7aa23 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 9eebdc14-bdb2-41b3-94db-1a5b2e988b68 (not-active)" ] } } 16142 1727204143.03554: no more pending results, returning what we have 16142 1727204143.03558: results queue empty 16142 1727204143.03568: checking for any_errors_fatal 16142 1727204143.03576: done checking for any_errors_fatal 16142 1727204143.03577: checking for max_fail_percentage 16142 1727204143.03579: done checking for max_fail_percentage 16142 1727204143.03580: checking to see if all hosts have failed and the running result is not ok 16142 1727204143.03580: done checking to see if all hosts have failed 16142 1727204143.03581: getting the remaining hosts for this loop 16142 1727204143.03583: done getting the remaining hosts for this loop 16142 1727204143.03586: getting the next task for host managed-node2 16142 1727204143.03596: done getting next task for host managed-node2 16142 1727204143.03600: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 16142 1727204143.03603: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204143.03618: getting variables 16142 1727204143.03620: in VariableManager get_vars() 16142 1727204143.03678: Calling all_inventory to load vars for managed-node2 16142 1727204143.03681: Calling groups_inventory to load vars for managed-node2 16142 1727204143.03683: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204143.03694: Calling all_plugins_play to load vars for managed-node2 16142 1727204143.03697: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204143.03700: Calling groups_plugins_play to load vars for managed-node2 16142 1727204143.06103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204143.08914: done with get_vars() 16142 1727204143.08976: done getting variables 16142 1727204143.09034: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:55:43 -0400 (0:00:00.082) 0:00:42.267 ***** 16142 1727204143.09073: entering _queue_task() for managed-node2/debug 16142 1727204143.09413: worker is 1 (out of 1 available) 16142 1727204143.09428: exiting _queue_task() for managed-node2/debug 16142 1727204143.09444: done queuing things up, now waiting for results queue to drain 16142 1727204143.09446: waiting for pending results... 16142 1727204143.10546: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 16142 1727204143.10722: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000e3 16142 1727204143.10748: variable 'ansible_search_path' from source: unknown 16142 1727204143.10757: variable 'ansible_search_path' from source: unknown 16142 1727204143.10805: calling self._execute() 16142 1727204143.10919: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204143.10932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204143.10951: variable 'omit' from source: magic vars 16142 1727204143.11373: variable 'ansible_distribution_major_version' from source: facts 16142 1727204143.11392: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204143.11524: variable 'network_state' from source: role '' defaults 16142 1727204143.11544: Evaluated conditional (network_state != {}): False 16142 1727204143.11557: when evaluation is False, skipping this task 16142 1727204143.11568: _execute() done 16142 1727204143.11577: dumping result to json 16142 1727204143.11586: done dumping result, returning 16142 1727204143.11598: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-fddd-f6c7-0000000000e3] 16142 1727204143.11610: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000e3 skipping: [managed-node2] => { "false_condition": "network_state != {}" } 16142 1727204143.11775: no more pending results, returning what we have 16142 1727204143.11780: results queue empty 16142 1727204143.11781: checking for any_errors_fatal 16142 1727204143.11795: done checking 
for any_errors_fatal 16142 1727204143.11796: checking for max_fail_percentage 16142 1727204143.11798: done checking for max_fail_percentage 16142 1727204143.11800: checking to see if all hosts have failed and the running result is not ok 16142 1727204143.11800: done checking to see if all hosts have failed 16142 1727204143.11801: getting the remaining hosts for this loop 16142 1727204143.11803: done getting the remaining hosts for this loop 16142 1727204143.11807: getting the next task for host managed-node2 16142 1727204143.11814: done getting next task for host managed-node2 16142 1727204143.11818: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 16142 1727204143.11822: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204143.11849: getting variables 16142 1727204143.11851: in VariableManager get_vars() 16142 1727204143.11912: Calling all_inventory to load vars for managed-node2 16142 1727204143.11916: Calling groups_inventory to load vars for managed-node2 16142 1727204143.11919: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204143.11931: Calling all_plugins_play to load vars for managed-node2 16142 1727204143.11937: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204143.11941: Calling groups_plugins_play to load vars for managed-node2 16142 1727204143.13486: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000e3 16142 1727204143.13490: WORKER PROCESS EXITING 16142 1727204143.13845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204143.15511: done with get_vars() 16142 1727204143.15547: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:55:43 -0400 (0:00:00.065) 0:00:42.333 ***** 16142 1727204143.15661: entering _queue_task() for managed-node2/ping 16142 1727204143.16017: worker is 1 (out of 1 available) 16142 1727204143.16031: exiting _queue_task() for managed-node2/ping 16142 1727204143.16047: done queuing things up, now waiting for results queue to drain 16142 1727204143.16048: waiting for pending results... 
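The role finishes by re-checking that the managed host is still reachable after the network changes, using the ping action module ("entering _queue_task() for managed-node2/ping"). A minimal sketch of such a re-test task, assuming it is a plain ping with no arguments (the role source may differ):

# Sketch of the "Re-test connectivity" step (assumed form).
- name: Re-test connectivity
  ansible.builtin.ping:

The SSH exchange that follows is the module-execution sequence already seen above: resolve the remote home directory, create a temporary directory under ~/.ansible/tmp, transfer and chmod the AnsiballZ payload, run it with the remote Python, and remove the temporary directory.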
16142 1727204143.16389: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 16142 1727204143.16548: in run() - task 0affcd87-79f5-fddd-f6c7-0000000000e4 16142 1727204143.16574: variable 'ansible_search_path' from source: unknown 16142 1727204143.16582: variable 'ansible_search_path' from source: unknown 16142 1727204143.16632: calling self._execute() 16142 1727204143.16745: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204143.16758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204143.16775: variable 'omit' from source: magic vars 16142 1727204143.17195: variable 'ansible_distribution_major_version' from source: facts 16142 1727204143.17215: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204143.17227: variable 'omit' from source: magic vars 16142 1727204143.17310: variable 'omit' from source: magic vars 16142 1727204143.17355: variable 'omit' from source: magic vars 16142 1727204143.17407: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204143.17450: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204143.17481: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204143.17645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204143.17663: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204143.17700: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204143.17708: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204143.17715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204143.17825: Set connection var ansible_timeout to 10 16142 1727204143.17833: Set connection var ansible_connection to ssh 16142 1727204143.17849: Set connection var ansible_shell_type to sh 16142 1727204143.17864: Set connection var ansible_shell_executable to /bin/sh 16142 1727204143.17876: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204143.17886: Set connection var ansible_pipelining to False 16142 1727204143.17912: variable 'ansible_shell_executable' from source: unknown 16142 1727204143.17918: variable 'ansible_connection' from source: unknown 16142 1727204143.17924: variable 'ansible_module_compression' from source: unknown 16142 1727204143.17929: variable 'ansible_shell_type' from source: unknown 16142 1727204143.17934: variable 'ansible_shell_executable' from source: unknown 16142 1727204143.17942: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204143.17948: variable 'ansible_pipelining' from source: unknown 16142 1727204143.17953: variable 'ansible_timeout' from source: unknown 16142 1727204143.17959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204143.18166: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204143.18205: variable 'omit' from source: magic vars 16142 
1727204143.18213: starting attempt loop 16142 1727204143.18219: running the handler 16142 1727204143.18237: _low_level_execute_command(): starting 16142 1727204143.18250: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204143.19010: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204143.19026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204143.19043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204143.19069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204143.19109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204143.19121: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204143.19138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204143.19157: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204143.19172: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204143.19184: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204143.19197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204143.19212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204143.19228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204143.19242: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204143.19251: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204143.19262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204143.19344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204143.19370: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204143.19393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204143.19479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204143.21166: stdout chunk (state=3): >>>/root <<< 16142 1727204143.21494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204143.21498: stdout chunk (state=3): >>><<< 16142 1727204143.21500: stderr chunk (state=3): >>><<< 16142 1727204143.21572: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204143.21577: _low_level_execute_command(): starting 16142 1727204143.21580: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204143.2153187-19144-100925076502234 `" && echo ansible-tmp-1727204143.2153187-19144-100925076502234="` echo /root/.ansible/tmp/ansible-tmp-1727204143.2153187-19144-100925076502234 `" ) && sleep 0' 16142 1727204143.23590: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204143.23593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204143.23618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204143.23748: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204143.23752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204143.23754: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204143.23757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204143.23827: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204143.23830: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204143.23995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204143.24054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204143.25880: stdout chunk (state=3): >>>ansible-tmp-1727204143.2153187-19144-100925076502234=/root/.ansible/tmp/ansible-tmp-1727204143.2153187-19144-100925076502234 <<< 16142 1727204143.25992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204143.26061: stderr chunk (state=3): >>><<< 16142 1727204143.26067: stdout chunk (state=3): >>><<< 16142 1727204143.26373: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204143.2153187-19144-100925076502234=/root/.ansible/tmp/ansible-tmp-1727204143.2153187-19144-100925076502234 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204143.26377: variable 'ansible_module_compression' from source: unknown 16142 1727204143.26380: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 16142 1727204143.26382: variable 'ansible_facts' from source: unknown 16142 1727204143.26384: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204143.2153187-19144-100925076502234/AnsiballZ_ping.py 16142 1727204143.26451: Sending initial data 16142 1727204143.26454: Sent initial data (153 bytes) 16142 1727204143.27499: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204143.27514: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204143.27529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204143.27549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204143.27601: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204143.27616: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204143.27631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204143.27649: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204143.27662: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204143.27676: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204143.27693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204143.27712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204143.27730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204143.27743: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204143.27756: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204143.27773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204143.27856: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204143.27882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204143.27899: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204143.27974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204143.29718: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204143.29760: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204143.29805: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpsc9tefpu /root/.ansible/tmp/ansible-tmp-1727204143.2153187-19144-100925076502234/AnsiballZ_ping.py <<< 16142 1727204143.29809: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204143.31243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204143.31443: stderr chunk (state=3): >>><<< 16142 1727204143.31446: stdout chunk (state=3): >>><<< 16142 1727204143.31449: done transferring module to remote 16142 1727204143.31451: _low_level_execute_command(): starting 16142 1727204143.31454: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204143.2153187-19144-100925076502234/ /root/.ansible/tmp/ansible-tmp-1727204143.2153187-19144-100925076502234/AnsiballZ_ping.py && sleep 0' 16142 1727204143.32482: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204143.32581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204143.32591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204143.32611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204143.32652: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204143.32716: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204143.32732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204143.32746: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204143.32756: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204143.32761: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204143.32771: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204143.32781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204143.32792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204143.32799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204143.32806: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204143.32818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204143.32898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204143.33069: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204143.33073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204143.33170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204143.34955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204143.34958: stdout chunk (state=3): >>><<< 16142 1727204143.34983: stderr chunk (state=3): >>><<< 16142 1727204143.34994: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204143.34999: _low_level_execute_command(): starting 16142 1727204143.35029: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204143.2153187-19144-100925076502234/AnsiballZ_ping.py && sleep 0' 16142 1727204143.35680: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204143.35687: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204143.35697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204143.35709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204143.35998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204143.36090: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204143.36093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204143.36096: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204143.36098: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204143.36101: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204143.36103: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204143.36105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204143.36107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204143.36109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204143.36111: stderr chunk 
(state=3): >>>debug2: match found <<< 16142 1727204143.36113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204143.36115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204143.36117: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204143.36119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204143.36166: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204143.49056: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 16142 1727204143.50119: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 16142 1727204143.50124: stderr chunk (state=3): >>><<< 16142 1727204143.50126: stdout chunk (state=3): >>><<< 16142 1727204143.50152: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
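Everything from the temporary-directory creation through the JSON result above is the standard remote-module life cycle for a single ping: create a tmpdir under ~/.ansible/tmp, sftp the AnsiballZ-wrapped module over the already-multiplexed SSH connection, chmod it, execute it with the remote Python, and read {"ping": "pong"} back on stdout. At the playbook level, the task driving all of this is just a ping; a sketch of what the "Re-test connectivity" task plausibly looks like (illustrative, not the role's verbatim source at tasks/main.yml:192):

    - name: Re-test connectivity
      ansible.builtin.ping:
      # A successful run returns {"ping": "pong"}, which the log above
      # shows arriving on stdout before the remote tmpdir is removed.
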
16142 1727204143.50180: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204143.2153187-19144-100925076502234/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204143.50187: _low_level_execute_command(): starting 16142 1727204143.50194: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204143.2153187-19144-100925076502234/ > /dev/null 2>&1 && sleep 0' 16142 1727204143.51876: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204143.51934: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204143.51944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204143.51958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204143.52369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204143.52373: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204143.52375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204143.52377: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204143.52380: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204143.52382: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204143.52384: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204143.52386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204143.52388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204143.52390: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204143.52392: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204143.52394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204143.52475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204143.52490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204143.52494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204143.52681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204143.54482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204143.54509: stderr chunk (state=3): >>><<< 16142 1727204143.54513: stdout chunk (state=3): >>><<< 16142 1727204143.54538: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204143.54542: handler run complete 16142 1727204143.54558: attempt loop complete, returning result 16142 1727204143.54561: _execute() done 16142 1727204143.54565: dumping result to json 16142 1727204143.54567: done dumping result, returning 16142 1727204143.54580: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-fddd-f6c7-0000000000e4] 16142 1727204143.54582: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000e4 16142 1727204143.54683: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000000e4 16142 1727204143.54686: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 16142 1727204143.54751: no more pending results, returning what we have 16142 1727204143.54755: results queue empty 16142 1727204143.54755: checking for any_errors_fatal 16142 1727204143.54763: done checking for any_errors_fatal 16142 1727204143.54765: checking for max_fail_percentage 16142 1727204143.54767: done checking for max_fail_percentage 16142 1727204143.54768: checking to see if all hosts have failed and the running result is not ok 16142 1727204143.54769: done checking to see if all hosts have failed 16142 1727204143.54770: getting the remaining hosts for this loop 16142 1727204143.54771: done getting the remaining hosts for this loop 16142 1727204143.54774: getting the next task for host managed-node2 16142 1727204143.54783: done getting next task for host managed-node2 16142 1727204143.54785: ^ task is: TASK: meta (role_complete) 16142 1727204143.54788: ^ state is: HOST STATE: block=2, task=28, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204143.54803: getting variables 16142 1727204143.54804: in VariableManager get_vars() 16142 1727204143.54859: Calling all_inventory to load vars for managed-node2 16142 1727204143.54862: Calling groups_inventory to load vars for managed-node2 16142 1727204143.54865: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204143.54875: Calling all_plugins_play to load vars for managed-node2 16142 1727204143.54877: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204143.54880: Calling groups_plugins_play to load vars for managed-node2 16142 1727204143.57686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204143.61053: done with get_vars() 16142 1727204143.61082: done getting variables 16142 1727204143.61182: done queuing things up, now waiting for results queue to drain 16142 1727204143.61185: results queue empty 16142 1727204143.61186: checking for any_errors_fatal 16142 1727204143.61188: done checking for any_errors_fatal 16142 1727204143.61189: checking for max_fail_percentage 16142 1727204143.61191: done checking for max_fail_percentage 16142 1727204143.61191: checking to see if all hosts have failed and the running result is not ok 16142 1727204143.61192: done checking to see if all hosts have failed 16142 1727204143.61193: getting the remaining hosts for this loop 16142 1727204143.61194: done getting the remaining hosts for this loop 16142 1727204143.61197: getting the next task for host managed-node2 16142 1727204143.61202: done getting next task for host managed-node2 16142 1727204143.61206: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 16142 1727204143.61212: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204143.61234: getting variables 16142 1727204143.61238: in VariableManager get_vars() 16142 1727204143.61265: Calling all_inventory to load vars for managed-node2 16142 1727204143.61267: Calling groups_inventory to load vars for managed-node2 16142 1727204143.61270: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204143.61275: Calling all_plugins_play to load vars for managed-node2 16142 1727204143.61277: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204143.61280: Calling groups_plugins_play to load vars for managed-node2 16142 1727204143.64426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204143.67813: done with get_vars() 16142 1727204143.67850: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:55:43 -0400 (0:00:00.522) 0:00:42.856 ***** 16142 1727204143.67941: entering _queue_task() for managed-node2/include_tasks 16142 1727204143.68994: worker is 1 (out of 1 available) 16142 1727204143.69008: exiting _queue_task() for managed-node2/include_tasks 16142 1727204143.69023: done queuing things up, now waiting for results queue to drain 16142 1727204143.69024: waiting for pending results... 16142 1727204143.70641: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 16142 1727204143.71351: in run() - task 0affcd87-79f5-fddd-f6c7-00000000011b 16142 1727204143.71367: variable 'ansible_search_path' from source: unknown 16142 1727204143.71371: variable 'ansible_search_path' from source: unknown 16142 1727204143.71671: calling self._execute() 16142 1727204143.71932: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204143.71938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204143.72175: variable 'omit' from source: magic vars 16142 1727204143.73243: variable 'ansible_distribution_major_version' from source: facts 16142 1727204143.73480: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204143.73487: _execute() done 16142 1727204143.73491: dumping result to json 16142 1727204143.73495: done dumping result, returning 16142 1727204143.73505: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-fddd-f6c7-00000000011b] 16142 1727204143.73510: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000011b 16142 1727204143.73616: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000011b 16142 1727204143.73619: WORKER PROCESS EXITING 16142 1727204143.73670: no more pending results, returning what we have 16142 1727204143.73676: in VariableManager get_vars() 16142 1727204143.73748: Calling all_inventory to load vars for managed-node2 16142 1727204143.73751: Calling groups_inventory to load vars for managed-node2 16142 1727204143.73754: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204143.73768: Calling all_plugins_play to load vars for managed-node2 16142 1727204143.73771: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204143.73774: Calling groups_plugins_play to load vars for managed-node2 16142 1727204143.76696: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204143.79605: done with get_vars() 16142 1727204143.79645: variable 'ansible_search_path' from source: unknown 16142 1727204143.79647: variable 'ansible_search_path' from source: unknown 16142 1727204143.79702: we have included files to process 16142 1727204143.79703: generating all_blocks data 16142 1727204143.79705: done generating all_blocks data 16142 1727204143.79712: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 16142 1727204143.79713: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 16142 1727204143.79716: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 16142 1727204143.80375: done processing included file 16142 1727204143.80378: iterating over new_blocks loaded from include file 16142 1727204143.80379: in VariableManager get_vars() 16142 1727204143.80415: done with get_vars() 16142 1727204143.80417: filtering new block on tags 16142 1727204143.80444: done filtering new block on tags 16142 1727204143.80447: in VariableManager get_vars() 16142 1727204143.80483: done with get_vars() 16142 1727204143.80485: filtering new block on tags 16142 1727204143.80508: done filtering new block on tags 16142 1727204143.80510: in VariableManager get_vars() 16142 1727204143.80551: done with get_vars() 16142 1727204143.80553: filtering new block on tags 16142 1727204143.80575: done filtering new block on tags 16142 1727204143.80578: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 16142 1727204143.80583: extending task lists for all hosts with included blocks 16142 1727204143.81483: done extending task lists 16142 1727204143.81484: done processing included files 16142 1727204143.81485: results queue empty 16142 1727204143.81486: checking for any_errors_fatal 16142 1727204143.81487: done checking for any_errors_fatal 16142 1727204143.81488: checking for max_fail_percentage 16142 1727204143.81489: done checking for max_fail_percentage 16142 1727204143.81490: checking to see if all hosts have failed and the running result is not ok 16142 1727204143.81491: done checking to see if all hosts have failed 16142 1727204143.81491: getting the remaining hosts for this loop 16142 1727204143.81492: done getting the remaining hosts for this loop 16142 1727204143.81495: getting the next task for host managed-node2 16142 1727204143.81499: done getting next task for host managed-node2 16142 1727204143.81502: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 16142 1727204143.81504: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204143.81519: getting variables 16142 1727204143.81520: in VariableManager get_vars() 16142 1727204143.81545: Calling all_inventory to load vars for managed-node2 16142 1727204143.81548: Calling groups_inventory to load vars for managed-node2 16142 1727204143.81550: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204143.81556: Calling all_plugins_play to load vars for managed-node2 16142 1727204143.81559: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204143.81562: Calling groups_plugins_play to load vars for managed-node2 16142 1727204143.82896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204143.84624: done with get_vars() 16142 1727204143.84663: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:55:43 -0400 (0:00:00.168) 0:00:43.024 ***** 16142 1727204143.84753: entering _queue_task() for managed-node2/setup 16142 1727204143.85137: worker is 1 (out of 1 available) 16142 1727204143.85149: exiting _queue_task() for managed-node2/setup 16142 1727204143.85162: done queuing things up, now waiting for results queue to drain 16142 1727204143.85165: waiting for pending results... 16142 1727204143.85492: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 16142 1727204143.85672: in run() - task 0affcd87-79f5-fddd-f6c7-00000000084f 16142 1727204143.85694: variable 'ansible_search_path' from source: unknown 16142 1727204143.85701: variable 'ansible_search_path' from source: unknown 16142 1727204143.85754: calling self._execute() 16142 1727204143.85860: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204143.85875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204143.85891: variable 'omit' from source: magic vars 16142 1727204143.86306: variable 'ansible_distribution_major_version' from source: facts 16142 1727204143.86324: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204143.86573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204143.91375: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204143.91478: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204143.91536: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204143.91581: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204143.91623: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204143.91711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
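The include at main.yml:4 pulls in set_facts.yml, whose first task ("Ensure ansible_facts used by role are present", queued above as a setup action) re-gathers facts only if something the role needs is missing. The gate is the difference-filter conditional evaluated a few entries further down; a sketch of the pattern (the gather_subset value and exact layout are assumptions, not the role's verbatim source at set_facts.yml:3):

    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min   # assumption: the role may request a different subset
      # Gather again only when at least one required fact is absent; here every
      # required fact was already cached, so the difference is empty, the task is
      # skipped, and the result is censored because of no_log.
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
      no_log: true
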
16142 1727204143.91756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204143.91787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204143.91844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204143.91868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204143.91924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204143.91968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204143.91998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204143.92051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204143.92076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204143.93022: variable '__network_required_facts' from source: role '' defaults 16142 1727204143.93155: variable 'ansible_facts' from source: unknown 16142 1727204143.94979: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 16142 1727204143.94988: when evaluation is False, skipping this task 16142 1727204143.94995: _execute() done 16142 1727204143.95002: dumping result to json 16142 1727204143.95010: done dumping result, returning 16142 1727204143.95031: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-fddd-f6c7-00000000084f] 16142 1727204143.95073: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000084f skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204143.95226: no more pending results, returning what we have 16142 1727204143.95232: results queue empty 16142 1727204143.95233: checking for any_errors_fatal 16142 1727204143.95234: done checking for any_errors_fatal 16142 1727204143.95237: checking for max_fail_percentage 16142 1727204143.95240: done checking for max_fail_percentage 16142 1727204143.95241: checking to see if all hosts have failed and the running result is not ok 16142 1727204143.95241: done checking to see if all hosts have failed 16142 1727204143.95242: getting the remaining hosts for 
this loop 16142 1727204143.95244: done getting the remaining hosts for this loop 16142 1727204143.95248: getting the next task for host managed-node2 16142 1727204143.95258: done getting next task for host managed-node2 16142 1727204143.95262: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 16142 1727204143.95268: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204143.95290: getting variables 16142 1727204143.95292: in VariableManager get_vars() 16142 1727204143.95351: Calling all_inventory to load vars for managed-node2 16142 1727204143.95354: Calling groups_inventory to load vars for managed-node2 16142 1727204143.95357: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204143.95376: Calling all_plugins_play to load vars for managed-node2 16142 1727204143.95380: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204143.95383: Calling groups_plugins_play to load vars for managed-node2 16142 1727204143.96579: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000084f 16142 1727204143.96583: WORKER PROCESS EXITING 16142 1727204143.99243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204144.03487: done with get_vars() 16142 1727204144.03527: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:55:44 -0400 (0:00:00.190) 0:00:43.214 ***** 16142 1727204144.03773: entering _queue_task() for managed-node2/stat 16142 1727204144.04807: worker is 1 (out of 1 available) 16142 1727204144.04821: exiting _queue_task() for managed-node2/stat 16142 1727204144.04837: done queuing things up, now waiting for results queue to drain 16142 1727204144.04839: waiting for pending results... 
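The next task, queued as a stat action, probes whether the managed host is an ostree-based system, but only when the __network_is_ostree flag was not already set by an earlier role run (as the false_condition in the following entries shows). A sketch of the pattern; the probed path and the register name are assumptions, not something visible in this log:

    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted   # assumption: typical marker file on ostree-booted systems
      register: __ostree_booted_stat   # hypothetical register name for illustration
      # Skip the probe entirely when a previous run already set the flag.
      when: not __network_is_ostree is defined
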
16142 1727204144.05040: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 16142 1727204144.05190: in run() - task 0affcd87-79f5-fddd-f6c7-000000000851 16142 1727204144.05204: variable 'ansible_search_path' from source: unknown 16142 1727204144.05209: variable 'ansible_search_path' from source: unknown 16142 1727204144.05248: calling self._execute() 16142 1727204144.05345: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204144.05349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204144.05360: variable 'omit' from source: magic vars 16142 1727204144.05970: variable 'ansible_distribution_major_version' from source: facts 16142 1727204144.05974: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204144.06112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204144.06411: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204144.06704: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204144.06740: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204144.06768: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204144.06852: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204144.06879: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204144.06902: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204144.06925: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204144.07017: variable '__network_is_ostree' from source: set_fact 16142 1727204144.07025: Evaluated conditional (not __network_is_ostree is defined): False 16142 1727204144.07029: when evaluation is False, skipping this task 16142 1727204144.07031: _execute() done 16142 1727204144.07034: dumping result to json 16142 1727204144.07039: done dumping result, returning 16142 1727204144.07046: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-fddd-f6c7-000000000851] 16142 1727204144.07052: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000851 16142 1727204144.07149: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000851 16142 1727204144.07151: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 16142 1727204144.07238: no more pending results, returning what we have 16142 1727204144.07242: results queue empty 16142 1727204144.07243: checking for any_errors_fatal 16142 1727204144.07250: done checking for any_errors_fatal 16142 1727204144.07251: checking for 
max_fail_percentage 16142 1727204144.07253: done checking for max_fail_percentage 16142 1727204144.07254: checking to see if all hosts have failed and the running result is not ok 16142 1727204144.07255: done checking to see if all hosts have failed 16142 1727204144.07255: getting the remaining hosts for this loop 16142 1727204144.07257: done getting the remaining hosts for this loop 16142 1727204144.07261: getting the next task for host managed-node2 16142 1727204144.07270: done getting next task for host managed-node2 16142 1727204144.07276: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 16142 1727204144.07280: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204144.07300: getting variables 16142 1727204144.07302: in VariableManager get_vars() 16142 1727204144.07367: Calling all_inventory to load vars for managed-node2 16142 1727204144.07371: Calling groups_inventory to load vars for managed-node2 16142 1727204144.07373: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204144.07384: Calling all_plugins_play to load vars for managed-node2 16142 1727204144.07387: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204144.07391: Calling groups_plugins_play to load vars for managed-node2 16142 1727204144.09822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204144.12785: done with get_vars() 16142 1727204144.12812: done getting variables 16142 1727204144.12888: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:55:44 -0400 (0:00:00.091) 0:00:43.306 ***** 16142 1727204144.12929: entering _queue_task() for managed-node2/set_fact 16142 1727204144.13433: worker is 1 (out of 1 available) 16142 1727204144.13448: exiting _queue_task() for managed-node2/set_fact 16142 1727204144.13461: done queuing things up, now waiting for results queue to drain 16142 1727204144.13463: waiting for pending results... 
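The companion set_fact task turns that stat result into the persistent __network_is_ostree flag so later runs can skip the probe; it carries the same guard, which is why it is also skipped here. A sketch under the same assumptions as the stat example above (the registered variable name remains hypothetical):

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        # __ostree_booted_stat is the hypothetical register from the stat sketch above.
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists | default(false) }}"
      when: not __network_is_ostree is defined
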
16142 1727204144.15632: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 16142 1727204144.15866: in run() - task 0affcd87-79f5-fddd-f6c7-000000000852 16142 1727204144.15873: variable 'ansible_search_path' from source: unknown 16142 1727204144.15876: variable 'ansible_search_path' from source: unknown 16142 1727204144.15884: calling self._execute() 16142 1727204144.15982: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204144.15986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204144.15996: variable 'omit' from source: magic vars 16142 1727204144.16728: variable 'ansible_distribution_major_version' from source: facts 16142 1727204144.16733: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204144.16739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204144.17277: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204144.17281: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204144.17295: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204144.17327: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204144.17869: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204144.17872: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204144.17875: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204144.17877: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204144.17879: variable '__network_is_ostree' from source: set_fact 16142 1727204144.17880: Evaluated conditional (not __network_is_ostree is defined): False 16142 1727204144.17882: when evaluation is False, skipping this task 16142 1727204144.17884: _execute() done 16142 1727204144.17885: dumping result to json 16142 1727204144.17887: done dumping result, returning 16142 1727204144.17889: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-fddd-f6c7-000000000852] 16142 1727204144.17891: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000852 16142 1727204144.17961: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000852 16142 1727204144.17963: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 16142 1727204144.18005: no more pending results, returning what we have 16142 1727204144.18008: results queue empty 16142 1727204144.18009: checking for any_errors_fatal 16142 1727204144.18015: done checking for any_errors_fatal 16142 
1727204144.18016: checking for max_fail_percentage 16142 1727204144.18018: done checking for max_fail_percentage 16142 1727204144.18018: checking to see if all hosts have failed and the running result is not ok 16142 1727204144.18019: done checking to see if all hosts have failed 16142 1727204144.18020: getting the remaining hosts for this loop 16142 1727204144.18021: done getting the remaining hosts for this loop 16142 1727204144.18024: getting the next task for host managed-node2 16142 1727204144.18033: done getting next task for host managed-node2 16142 1727204144.18037: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 16142 1727204144.18040: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204144.18057: getting variables 16142 1727204144.18059: in VariableManager get_vars() 16142 1727204144.18162: Calling all_inventory to load vars for managed-node2 16142 1727204144.18168: Calling groups_inventory to load vars for managed-node2 16142 1727204144.18171: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204144.18179: Calling all_plugins_play to load vars for managed-node2 16142 1727204144.18182: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204144.18185: Calling groups_plugins_play to load vars for managed-node2 16142 1727204144.20408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204144.23143: done with get_vars() 16142 1727204144.23295: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:55:44 -0400 (0:00:00.105) 0:00:43.411 ***** 16142 1727204144.23512: entering _queue_task() for managed-node2/service_facts 16142 1727204144.24247: worker is 1 (out of 1 available) 16142 1727204144.24267: exiting _queue_task() for managed-node2/service_facts 16142 1727204144.24281: done queuing things up, now waiting for results queue to drain 16142 1727204144.24283: waiting for pending results... 
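The next task, set_facts.yml:21, dispatches the service_facts action for managed-node2. The role file is not shown in this log, so the tasks below are only a sketch under that assumption: the first mirrors the task name the log reports, and the second is a hypothetical consumer showing how the ansible_facts.services dictionary (printed in full in the module output further down) is typically read.

    - name: Check which services are running
      ansible.builtin.service_facts:

    - name: Read one entry from the gathered facts (hypothetical example)
      ansible.builtin.debug:
        msg: "NetworkManager is {{ ansible_facts.services['NetworkManager.service'].state }}"
      when: "'NetworkManager.service' in ansible_facts.services"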
16142 1727204144.24658: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 16142 1727204144.24900: in run() - task 0affcd87-79f5-fddd-f6c7-000000000854 16142 1727204144.24914: variable 'ansible_search_path' from source: unknown 16142 1727204144.24918: variable 'ansible_search_path' from source: unknown 16142 1727204144.24968: calling self._execute() 16142 1727204144.25404: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204144.25415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204144.25437: variable 'omit' from source: magic vars 16142 1727204144.25839: variable 'ansible_distribution_major_version' from source: facts 16142 1727204144.25860: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204144.25878: variable 'omit' from source: magic vars 16142 1727204144.25957: variable 'omit' from source: magic vars 16142 1727204144.26010: variable 'omit' from source: magic vars 16142 1727204144.26056: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204144.26099: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204144.26122: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204144.26144: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204144.26159: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204144.26196: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204144.26204: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204144.26210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204144.26320: Set connection var ansible_timeout to 10 16142 1727204144.26328: Set connection var ansible_connection to ssh 16142 1727204144.26341: Set connection var ansible_shell_type to sh 16142 1727204144.26351: Set connection var ansible_shell_executable to /bin/sh 16142 1727204144.26359: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204144.26372: Set connection var ansible_pipelining to False 16142 1727204144.26401: variable 'ansible_shell_executable' from source: unknown 16142 1727204144.26411: variable 'ansible_connection' from source: unknown 16142 1727204144.26419: variable 'ansible_module_compression' from source: unknown 16142 1727204144.26425: variable 'ansible_shell_type' from source: unknown 16142 1727204144.26431: variable 'ansible_shell_executable' from source: unknown 16142 1727204144.26440: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204144.26447: variable 'ansible_pipelining' from source: unknown 16142 1727204144.26453: variable 'ansible_timeout' from source: unknown 16142 1727204144.26460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204144.26674: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204144.26690: variable 'omit' from source: magic vars 16142 
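The block of "Set connection var" messages above fixes the transport for this task: an ssh connection, the sh shell at /bin/sh, a 10 second timeout, ZIP_DEFLATED module compression, and pipelining disabled. In this run most of these come from built-in defaults (the log reads them "from source: unknown") rather than the inventory; spelling them out as host_vars, as sketched below, would pin the same behaviour explicitly.

    # host_vars/managed-node2.yml (hypothetical; mirrors the values logged above)
    ansible_connection: ssh
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_timeout: 10
    ansible_pipelining: false
    ansible_module_compression: ZIP_DEFLATED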
1727204144.26700: starting attempt loop 16142 1727204144.26706: running the handler 16142 1727204144.26723: _low_level_execute_command(): starting 16142 1727204144.26741: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204144.27497: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204144.27513: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204144.27525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204144.27546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204144.27589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204144.27601: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204144.27617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204144.27634: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204144.27646: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204144.27655: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204144.27666: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204144.27677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204144.27689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204144.27699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204144.27713: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204144.27732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204144.27815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204144.27843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204144.27858: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204144.27930: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204144.29585: stdout chunk (state=3): >>>/root <<< 16142 1727204144.29798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204144.29802: stdout chunk (state=3): >>><<< 16142 1727204144.29805: stderr chunk (state=3): >>><<< 16142 1727204144.29927: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204144.29932: _low_level_execute_command(): starting 16142 1727204144.29938: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204144.298255-19248-226730297451023 `" && echo ansible-tmp-1727204144.298255-19248-226730297451023="` echo /root/.ansible/tmp/ansible-tmp-1727204144.298255-19248-226730297451023 `" ) && sleep 0' 16142 1727204144.30617: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204144.30632: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204144.30653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204144.30675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204144.30730: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204144.30747: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204144.30761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204144.30782: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204144.30801: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204144.30815: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204144.30829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204144.30850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204144.30870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204144.30885: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204144.30897: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204144.30917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204144.31002: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204144.31032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204144.31056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204144.31148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204144.33002: stdout chunk (state=3): >>>ansible-tmp-1727204144.298255-19248-226730297451023=/root/.ansible/tmp/ansible-tmp-1727204144.298255-19248-226730297451023 <<< 16142 1727204144.33124: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204144.33221: stderr chunk (state=3): >>><<< 16142 1727204144.33233: stdout chunk (state=3): >>><<< 16142 1727204144.33273: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204144.298255-19248-226730297451023=/root/.ansible/tmp/ansible-tmp-1727204144.298255-19248-226730297451023 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204144.33472: variable 'ansible_module_compression' from source: unknown 16142 1727204144.33476: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 16142 1727204144.33478: variable 'ansible_facts' from source: unknown 16142 1727204144.33507: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204144.298255-19248-226730297451023/AnsiballZ_service_facts.py 16142 1727204144.33677: Sending initial data 16142 1727204144.33680: Sent initial data (161 bytes) 16142 1727204144.34709: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204144.34723: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204144.34738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204144.34753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204144.34800: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204144.34810: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204144.34823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204144.34840: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204144.34850: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204144.34859: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204144.34871: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204144.34884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204144.34899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204144.34908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204144.34916: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204144.34926: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204144.35005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204144.35024: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204144.35040: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204144.35107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204144.36816: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204144.36851: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204144.36917: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmp07j_ybwl /root/.ansible/tmp/ansible-tmp-1727204144.298255-19248-226730297451023/AnsiballZ_service_facts.py <<< 16142 1727204144.36933: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204144.38066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204144.38274: stderr chunk (state=3): >>><<< 16142 1727204144.38277: stdout chunk (state=3): >>><<< 16142 1727204144.38279: done transferring module to remote 16142 1727204144.38289: _low_level_execute_command(): starting 16142 1727204144.38292: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204144.298255-19248-226730297451023/ /root/.ansible/tmp/ansible-tmp-1727204144.298255-19248-226730297451023/AnsiballZ_service_facts.py && sleep 0' 16142 1727204144.38946: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204144.38962: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204144.38981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204144.38999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204144.39052: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204144.39066: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204144.39171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204144.39404: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 16142 1727204144.39412: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 
1727204144.39415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204144.39490: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204144.39493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204144.39506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204144.39583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204144.41395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204144.41432: stderr chunk (state=3): >>><<< 16142 1727204144.41438: stdout chunk (state=3): >>><<< 16142 1727204144.41566: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204144.41570: _low_level_execute_command(): starting 16142 1727204144.41573: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204144.298255-19248-226730297451023/AnsiballZ_service_facts.py && sleep 0' 16142 1727204144.42275: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204144.42291: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204144.42307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204144.42334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204144.42384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204144.42398: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204144.42414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204144.42435: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204144.42457: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204144.42471: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204144.42484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204144.42499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 
1727204144.42515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204144.42526: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204144.42542: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204144.42568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204144.42645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204144.42679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204144.42698: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204144.42788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204145.72072: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", <<< 16142 1727204145.72098: stdout chunk (state=3): >>>"source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": 
{"name": "plymouth-qu<<< 16142 1727204145.72137: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": 
"systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", 
"status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.s<<< 16142 1727204145.72160: stdout chunk (state=3): >>>ervice", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": 
"indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 16142 1727204145.73489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 16142 1727204145.73492: stdout chunk (state=3): >>><<< 16142 1727204145.73494: stderr chunk (state=3): >>><<< 16142 1727204145.74071: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": 
"dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, 
"sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": 
"console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": 
"systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.13.78 closed. 16142 1727204145.74207: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204144.298255-19248-226730297451023/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204145.74225: _low_level_execute_command(): starting 16142 1727204145.74233: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204144.298255-19248-226730297451023/ > /dev/null 2>&1 && sleep 0' 16142 1727204145.74871: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204145.74887: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204145.74902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204145.74920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204145.74963: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204145.74980: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204145.74997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204145.75016: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204145.75029: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204145.75041: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204145.75054: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204145.75070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204145.75086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204145.75099: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204145.75111: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204145.75125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204145.75206: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204145.75224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204145.75239: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204145.75315: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204145.77184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204145.77204: stderr chunk (state=3): >>><<< 16142 1727204145.77207: stdout chunk (state=3): >>><<< 16142 1727204145.77226: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204145.77232: handler run complete 16142 1727204145.77423: variable 'ansible_facts' from source: unknown 16142 1727204145.77945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204145.78408: variable 'ansible_facts' from source: unknown 16142 1727204145.78542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204145.78750: attempt loop complete, returning result 16142 1727204145.78754: _execute() done 16142 1727204145.78756: dumping result to json 16142 1727204145.78825: done dumping result, returning 16142 1727204145.78835: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-fddd-f6c7-000000000854] 16142 1727204145.78843: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000854 16142 1727204145.79806: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000854 16142 1727204145.79810: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204145.79885: no more pending results, returning what we have 16142 1727204145.79888: results queue empty 16142 1727204145.79889: checking for any_errors_fatal 16142 1727204145.79893: done checking for any_errors_fatal 16142 1727204145.79893: checking for max_fail_percentage 16142 1727204145.79895: done checking for max_fail_percentage 16142 1727204145.79896: checking to see if all hosts have failed and the running result is not ok 16142 1727204145.79896: done checking to see if all hosts have failed 16142 1727204145.79897: getting the remaining hosts for this loop 16142 1727204145.79898: done getting the remaining hosts for this loop 16142 1727204145.79902: getting the next task for host managed-node2 16142 1727204145.79907: done getting next task for host managed-node2 16142 1727204145.79911: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 16142 1727204145.79915: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204145.79925: getting variables 16142 1727204145.79926: in VariableManager get_vars() 16142 1727204145.79973: Calling all_inventory to load vars for managed-node2 16142 1727204145.79976: Calling groups_inventory to load vars for managed-node2 16142 1727204145.79978: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204145.79987: Calling all_plugins_play to load vars for managed-node2 16142 1727204145.79990: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204145.79998: Calling groups_plugins_play to load vars for managed-node2 16142 1727204145.81673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204145.83357: done with get_vars() 16142 1727204145.83387: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:55:45 -0400 (0:00:01.599) 0:00:45.011 ***** 16142 1727204145.83485: entering _queue_task() for managed-node2/package_facts 16142 1727204145.83831: worker is 1 (out of 1 available) 16142 1727204145.83846: exiting _queue_task() for managed-node2/package_facts 16142 1727204145.83859: done queuing things up, now waiting for results queue to drain 16142 1727204145.83861: waiting for pending results... 
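[editor's note] For readers following the trace: the task queued above runs the package_facts module, and the JSON it returns appears further down as the ansible_facts.packages dictionary, keyed by package name, each entry a list of {name, version, release, epoch, arch, source} dicts. A minimal standalone sketch of an equivalent task is shown below, assuming a play that already targets managed-node2; the debug step and the choice of the glibc key are illustrative and not taken from the role's set_facts.yml:

    # Gather installed-package facts, mirroring the role's
    # "Check which packages are installed" task, then read one entry back.
    - name: Check which packages are installed
      ansible.builtin.package_facts:

    # Illustrative only: print the glibc entry from the gathered facts
    # (glibc is one of the packages reported later in this log).
    - name: Show glibc as reported by package_facts
      ansible.builtin.debug:
        var: ansible_facts.packages['glibc']
      when: "'glibc' in ansible_facts.packages"
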
16142 1727204145.84154: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 16142 1727204145.84323: in run() - task 0affcd87-79f5-fddd-f6c7-000000000855 16142 1727204145.84345: variable 'ansible_search_path' from source: unknown 16142 1727204145.84354: variable 'ansible_search_path' from source: unknown 16142 1727204145.84397: calling self._execute() 16142 1727204145.84494: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204145.84505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204145.84519: variable 'omit' from source: magic vars 16142 1727204145.84920: variable 'ansible_distribution_major_version' from source: facts 16142 1727204145.84939: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204145.84952: variable 'omit' from source: magic vars 16142 1727204145.85062: variable 'omit' from source: magic vars 16142 1727204145.85108: variable 'omit' from source: magic vars 16142 1727204145.85155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204145.85202: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204145.85232: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204145.85255: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204145.85274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204145.85315: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204145.85323: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204145.85331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204145.85462: Set connection var ansible_timeout to 10 16142 1727204145.85474: Set connection var ansible_connection to ssh 16142 1727204145.85486: Set connection var ansible_shell_type to sh 16142 1727204145.85497: Set connection var ansible_shell_executable to /bin/sh 16142 1727204145.85512: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204145.85524: Set connection var ansible_pipelining to False 16142 1727204145.85551: variable 'ansible_shell_executable' from source: unknown 16142 1727204145.85559: variable 'ansible_connection' from source: unknown 16142 1727204145.85573: variable 'ansible_module_compression' from source: unknown 16142 1727204145.85595: variable 'ansible_shell_type' from source: unknown 16142 1727204145.85604: variable 'ansible_shell_executable' from source: unknown 16142 1727204145.85617: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204145.85626: variable 'ansible_pipelining' from source: unknown 16142 1727204145.85632: variable 'ansible_timeout' from source: unknown 16142 1727204145.85641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204145.85850: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204145.85888: variable 'omit' from source: magic vars 16142 
1727204145.85898: starting attempt loop 16142 1727204145.85905: running the handler 16142 1727204145.85924: _low_level_execute_command(): starting 16142 1727204145.85939: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204145.86740: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204145.86756: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204145.86779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204145.86801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204145.86849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204145.86862: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204145.86893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204145.86918: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204145.86932: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204145.86946: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204145.86960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204145.86978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204145.86996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204145.87011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204145.87028: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204145.87044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204145.87128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204145.87157: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204145.87172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204145.87249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204145.88885: stdout chunk (state=3): >>>/root <<< 16142 1727204145.89100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204145.89104: stdout chunk (state=3): >>><<< 16142 1727204145.89106: stderr chunk (state=3): >>><<< 16142 1727204145.89254: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204145.89257: _low_level_execute_command(): starting 16142 1727204145.89260: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204145.8915565-19452-39993114538509 `" && echo ansible-tmp-1727204145.8915565-19452-39993114538509="` echo /root/.ansible/tmp/ansible-tmp-1727204145.8915565-19452-39993114538509 `" ) && sleep 0' 16142 1727204145.89989: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204145.90002: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204145.90017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204145.90040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204145.90091: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204145.90103: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204145.90116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204145.90132: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204145.90150: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204145.90163: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204145.90178: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204145.90192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204145.90208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204145.90220: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204145.90232: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204145.90246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204145.90326: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204145.90343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204145.90359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204145.90494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204145.92345: stdout chunk (state=3): >>>ansible-tmp-1727204145.8915565-19452-39993114538509=/root/.ansible/tmp/ansible-tmp-1727204145.8915565-19452-39993114538509 <<< 16142 1727204145.92499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204145.92577: stderr chunk (state=3): >>><<< 16142 1727204145.92580: stdout chunk (state=3): >>><<< 16142 1727204145.92776: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204145.8915565-19452-39993114538509=/root/.ansible/tmp/ansible-tmp-1727204145.8915565-19452-39993114538509 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204145.92780: variable 'ansible_module_compression' from source: unknown 16142 1727204145.92782: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 16142 1727204145.92785: variable 'ansible_facts' from source: unknown 16142 1727204145.92999: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204145.8915565-19452-39993114538509/AnsiballZ_package_facts.py 16142 1727204145.93256: Sending initial data 16142 1727204145.93259: Sent initial data (161 bytes) 16142 1727204145.94393: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204145.94412: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204145.94433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204145.94455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204145.94501: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204145.94516: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204145.94542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204145.94562: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204145.94578: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204145.94591: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204145.94604: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204145.94619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204145.94643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204145.94665: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204145.94679: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204145.94693: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204145.94797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204145.94828: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204145.94851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204145.94924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204145.96645: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204145.96713: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204145.96751: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmp53ypv44p /root/.ansible/tmp/ansible-tmp-1727204145.8915565-19452-39993114538509/AnsiballZ_package_facts.py <<< 16142 1727204145.97095: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204145.99969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204146.00108: stderr chunk (state=3): >>><<< 16142 1727204146.00111: stdout chunk (state=3): >>><<< 16142 1727204146.00114: done transferring module to remote 16142 1727204146.00119: _low_level_execute_command(): starting 16142 1727204146.00122: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204145.8915565-19452-39993114538509/ /root/.ansible/tmp/ansible-tmp-1727204145.8915565-19452-39993114538509/AnsiballZ_package_facts.py && sleep 0' 16142 1727204146.00818: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204146.00828: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204146.00841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204146.00853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204146.00899: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204146.00906: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204146.00916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204146.00929: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204146.00939: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204146.00942: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204146.00951: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204146.00960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204146.00973: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204146.00983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204146.00990: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204146.01000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204146.01072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204146.01085: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204146.01095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204146.01256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204146.03090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204146.03157: stderr chunk (state=3): >>><<< 16142 1727204146.03160: stdout chunk (state=3): >>><<< 16142 1727204146.03179: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204146.03182: _low_level_execute_command(): starting 16142 1727204146.03187: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204145.8915565-19452-39993114538509/AnsiballZ_package_facts.py && sleep 0' 16142 1727204146.04130: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204146.04139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204146.04241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204146.04247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204146.04262: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 16142 1727204146.04270: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204146.04275: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204146.04280: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204146.04285: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204146.04298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204146.04492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204146.04517: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204146.04567: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204146.50941: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": 
[{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": <<< 16142 1727204146.51014: stdout chunk (state=3): >>>"53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", 
"version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": 
"grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", 
"version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": 
[{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": 
"iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, 
"arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", 
"release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": 
"35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 16142 1727204146.52576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204146.52580: stdout chunk (state=3): >>><<< 16142 1727204146.52583: stderr chunk (state=3): >>><<< 16142 1727204146.52675: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": 
"json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": 
"cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": 
"selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": 
"12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": 
[{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": 
"1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": 
"4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": 
"noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 
4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": 
"libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": 
"perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
16142 1727204146.55183: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204145.8915565-19452-39993114538509/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204146.55217: _low_level_execute_command(): starting 16142 1727204146.55228: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204145.8915565-19452-39993114538509/ > /dev/null 2>&1 && sleep 0' 16142 1727204146.56339: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204146.56358: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204146.56377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204146.56396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204146.56441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204146.56459: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204146.56477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204146.56496: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204146.56509: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204146.56519: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204146.56530: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204146.56547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204146.56571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204146.56588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204146.56601: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204146.56616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204146.56699: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204146.56722: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204146.56739: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204146.56813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204146.58728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204146.58732: stdout chunk (state=3): >>><<< 16142 1727204146.58734: stderr chunk (state=3): >>><<< 16142 1727204146.59201: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204146.59205: handler run complete 16142 1727204146.61114: variable 'ansible_facts' from source: unknown 16142 1727204146.61720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204146.64833: variable 'ansible_facts' from source: unknown 16142 1727204146.65600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204146.66391: attempt loop complete, returning result 16142 1727204146.66414: _execute() done 16142 1727204146.66422: dumping result to json 16142 1727204146.66666: done dumping result, returning 16142 1727204146.66683: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-fddd-f6c7-000000000855] 16142 1727204146.66694: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000855 16142 1727204146.69278: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000855 16142 1727204146.69282: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204146.69418: no more pending results, returning what we have 16142 1727204146.69421: results queue empty 16142 1727204146.69422: checking for any_errors_fatal 16142 1727204146.69426: done checking for any_errors_fatal 16142 1727204146.69427: checking for max_fail_percentage 16142 1727204146.69429: done checking for max_fail_percentage 16142 1727204146.69429: checking to see if all hosts have failed and the running result is not ok 16142 1727204146.69430: done checking to see if all hosts have failed 16142 1727204146.69431: getting the remaining hosts for this loop 16142 1727204146.69432: done getting the remaining hosts for this loop 16142 1727204146.69436: getting the next task for host managed-node2 16142 1727204146.69442: done getting next task for host managed-node2 16142 1727204146.69446: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 16142 1727204146.69449: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204146.69461: getting variables 16142 1727204146.69462: in VariableManager get_vars() 16142 1727204146.69513: Calling all_inventory to load vars for managed-node2 16142 1727204146.69516: Calling groups_inventory to load vars for managed-node2 16142 1727204146.69518: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204146.69528: Calling all_plugins_play to load vars for managed-node2 16142 1727204146.69530: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204146.69534: Calling groups_plugins_play to load vars for managed-node2 16142 1727204146.70815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204146.72537: done with get_vars() 16142 1727204146.72579: done getting variables 16142 1727204146.72649: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:55:46 -0400 (0:00:00.891) 0:00:45.903 ***** 16142 1727204146.72688: entering _queue_task() for managed-node2/debug 16142 1727204146.73043: worker is 1 (out of 1 available) 16142 1727204146.73062: exiting _queue_task() for managed-node2/debug 16142 1727204146.73077: done queuing things up, now waiting for results queue to drain 16142 1727204146.73078: waiting for pending results... 
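The task queued here, logged at roles/network/tasks/main.yml:7, is a plain debug action that reports the provider chosen earlier via set_fact; the result printed below is "Using network provider: nm". A minimal sketch of an equivalent task, reconstructed from the logged action plugin and message rather than copied from the role:

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"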
16142 1727204146.73394: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 16142 1727204146.73537: in run() - task 0affcd87-79f5-fddd-f6c7-00000000011c 16142 1727204146.73550: variable 'ansible_search_path' from source: unknown 16142 1727204146.73553: variable 'ansible_search_path' from source: unknown 16142 1727204146.73590: calling self._execute() 16142 1727204146.73684: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204146.73687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204146.73698: variable 'omit' from source: magic vars 16142 1727204146.74080: variable 'ansible_distribution_major_version' from source: facts 16142 1727204146.74092: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204146.74098: variable 'omit' from source: magic vars 16142 1727204146.74161: variable 'omit' from source: magic vars 16142 1727204146.74262: variable 'network_provider' from source: set_fact 16142 1727204146.74285: variable 'omit' from source: magic vars 16142 1727204146.74325: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204146.74355: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204146.74380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204146.74400: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204146.74410: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204146.74439: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204146.74442: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204146.74444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204146.74552: Set connection var ansible_timeout to 10 16142 1727204146.74555: Set connection var ansible_connection to ssh 16142 1727204146.74560: Set connection var ansible_shell_type to sh 16142 1727204146.74567: Set connection var ansible_shell_executable to /bin/sh 16142 1727204146.74574: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204146.74587: Set connection var ansible_pipelining to False 16142 1727204146.74614: variable 'ansible_shell_executable' from source: unknown 16142 1727204146.74618: variable 'ansible_connection' from source: unknown 16142 1727204146.74620: variable 'ansible_module_compression' from source: unknown 16142 1727204146.74623: variable 'ansible_shell_type' from source: unknown 16142 1727204146.74625: variable 'ansible_shell_executable' from source: unknown 16142 1727204146.74627: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204146.74629: variable 'ansible_pipelining' from source: unknown 16142 1727204146.74631: variable 'ansible_timeout' from source: unknown 16142 1727204146.74639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204146.74785: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 16142 1727204146.74799: variable 'omit' from source: magic vars 16142 1727204146.74805: starting attempt loop 16142 1727204146.74808: running the handler 16142 1727204146.74857: handler run complete 16142 1727204146.74873: attempt loop complete, returning result 16142 1727204146.74876: _execute() done 16142 1727204146.74879: dumping result to json 16142 1727204146.74881: done dumping result, returning 16142 1727204146.74889: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-fddd-f6c7-00000000011c] 16142 1727204146.74895: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000011c 16142 1727204146.74993: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000011c 16142 1727204146.74997: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 16142 1727204146.75073: no more pending results, returning what we have 16142 1727204146.75077: results queue empty 16142 1727204146.75078: checking for any_errors_fatal 16142 1727204146.75091: done checking for any_errors_fatal 16142 1727204146.75092: checking for max_fail_percentage 16142 1727204146.75093: done checking for max_fail_percentage 16142 1727204146.75094: checking to see if all hosts have failed and the running result is not ok 16142 1727204146.75095: done checking to see if all hosts have failed 16142 1727204146.75096: getting the remaining hosts for this loop 16142 1727204146.75098: done getting the remaining hosts for this loop 16142 1727204146.75102: getting the next task for host managed-node2 16142 1727204146.75110: done getting next task for host managed-node2 16142 1727204146.75115: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 16142 1727204146.75118: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204146.75131: getting variables 16142 1727204146.75134: in VariableManager get_vars() 16142 1727204146.75198: Calling all_inventory to load vars for managed-node2 16142 1727204146.75201: Calling groups_inventory to load vars for managed-node2 16142 1727204146.75204: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204146.75214: Calling all_plugins_play to load vars for managed-node2 16142 1727204146.75218: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204146.75221: Calling groups_plugins_play to load vars for managed-node2 16142 1727204146.77031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204146.78782: done with get_vars() 16142 1727204146.78817: done getting variables 16142 1727204146.78888: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:55:46 -0400 (0:00:00.062) 0:00:45.966 ***** 16142 1727204146.78926: entering _queue_task() for managed-node2/fail 16142 1727204146.79284: worker is 1 (out of 1 available) 16142 1727204146.79301: exiting _queue_task() for managed-node2/fail 16142 1727204146.79313: done queuing things up, now waiting for results queue to drain 16142 1727204146.79314: waiting for pending results... 
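This abort task (main.yml:11), like the similar one that follows at main.yml:18, is a fail action guarded by when conditions. Both are skipped in this run because network_state resolves to the role's empty default, so the logged conditional network_state != {} evaluates to False. A minimal sketch of the pattern; the failure message and the provider check are assumptions, since only the network_state condition is visible in the log:

- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying `network_state` is not supported with the initscripts provider  # assumed wording
  when:
    - network_state != {}                   # logged condition, False in this run
    - network_provider == "initscripts"     # assumption, not visible in this log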
16142 1727204146.79611: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 16142 1727204146.79743: in run() - task 0affcd87-79f5-fddd-f6c7-00000000011d 16142 1727204146.79760: variable 'ansible_search_path' from source: unknown 16142 1727204146.79766: variable 'ansible_search_path' from source: unknown 16142 1727204146.79800: calling self._execute() 16142 1727204146.79899: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204146.79904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204146.79914: variable 'omit' from source: magic vars 16142 1727204146.80299: variable 'ansible_distribution_major_version' from source: facts 16142 1727204146.80314: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204146.80442: variable 'network_state' from source: role '' defaults 16142 1727204146.80452: Evaluated conditional (network_state != {}): False 16142 1727204146.80455: when evaluation is False, skipping this task 16142 1727204146.80458: _execute() done 16142 1727204146.80461: dumping result to json 16142 1727204146.80466: done dumping result, returning 16142 1727204146.80475: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-fddd-f6c7-00000000011d] 16142 1727204146.80481: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000011d 16142 1727204146.80587: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000011d 16142 1727204146.80590: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204146.80670: no more pending results, returning what we have 16142 1727204146.80675: results queue empty 16142 1727204146.80676: checking for any_errors_fatal 16142 1727204146.80684: done checking for any_errors_fatal 16142 1727204146.80685: checking for max_fail_percentage 16142 1727204146.80688: done checking for max_fail_percentage 16142 1727204146.80689: checking to see if all hosts have failed and the running result is not ok 16142 1727204146.80690: done checking to see if all hosts have failed 16142 1727204146.80690: getting the remaining hosts for this loop 16142 1727204146.80692: done getting the remaining hosts for this loop 16142 1727204146.80696: getting the next task for host managed-node2 16142 1727204146.80705: done getting next task for host managed-node2 16142 1727204146.80710: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 16142 1727204146.80713: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204146.80736: getting variables 16142 1727204146.80739: in VariableManager get_vars() 16142 1727204146.80796: Calling all_inventory to load vars for managed-node2 16142 1727204146.80799: Calling groups_inventory to load vars for managed-node2 16142 1727204146.80801: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204146.80814: Calling all_plugins_play to load vars for managed-node2 16142 1727204146.80818: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204146.80821: Calling groups_plugins_play to load vars for managed-node2 16142 1727204146.82739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204146.84621: done with get_vars() 16142 1727204146.84654: done getting variables 16142 1727204146.84729: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:55:46 -0400 (0:00:00.058) 0:00:46.024 ***** 16142 1727204146.84776: entering _queue_task() for managed-node2/fail 16142 1727204146.85136: worker is 1 (out of 1 available) 16142 1727204146.85149: exiting _queue_task() for managed-node2/fail 16142 1727204146.85162: done queuing things up, now waiting for results queue to drain 16142 1727204146.85167: waiting for pending results... 
16142 1727204146.85483: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 16142 1727204146.85629: in run() - task 0affcd87-79f5-fddd-f6c7-00000000011e 16142 1727204146.85647: variable 'ansible_search_path' from source: unknown 16142 1727204146.85651: variable 'ansible_search_path' from source: unknown 16142 1727204146.85690: calling self._execute() 16142 1727204146.85793: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204146.85796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204146.85927: variable 'omit' from source: magic vars 16142 1727204146.87710: variable 'ansible_distribution_major_version' from source: facts 16142 1727204146.87722: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204146.87969: variable 'network_state' from source: role '' defaults 16142 1727204146.88084: Evaluated conditional (network_state != {}): False 16142 1727204146.88088: when evaluation is False, skipping this task 16142 1727204146.88091: _execute() done 16142 1727204146.88095: dumping result to json 16142 1727204146.88098: done dumping result, returning 16142 1727204146.88104: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-fddd-f6c7-00000000011e] 16142 1727204146.88111: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000011e 16142 1727204146.88273: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000011e 16142 1727204146.88277: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204146.88336: no more pending results, returning what we have 16142 1727204146.88341: results queue empty 16142 1727204146.88342: checking for any_errors_fatal 16142 1727204146.88352: done checking for any_errors_fatal 16142 1727204146.88353: checking for max_fail_percentage 16142 1727204146.88356: done checking for max_fail_percentage 16142 1727204146.88357: checking to see if all hosts have failed and the running result is not ok 16142 1727204146.88358: done checking to see if all hosts have failed 16142 1727204146.88359: getting the remaining hosts for this loop 16142 1727204146.88360: done getting the remaining hosts for this loop 16142 1727204146.88366: getting the next task for host managed-node2 16142 1727204146.88375: done getting next task for host managed-node2 16142 1727204146.88380: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 16142 1727204146.88383: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204146.88407: getting variables 16142 1727204146.88409: in VariableManager get_vars() 16142 1727204146.88474: Calling all_inventory to load vars for managed-node2 16142 1727204146.88478: Calling groups_inventory to load vars for managed-node2 16142 1727204146.88480: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204146.88493: Calling all_plugins_play to load vars for managed-node2 16142 1727204146.88496: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204146.88499: Calling groups_plugins_play to load vars for managed-node2 16142 1727204146.99583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204147.01450: done with get_vars() 16142 1727204147.02089: done getting variables 16142 1727204147.02141: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:55:47 -0400 (0:00:00.174) 0:00:46.198 ***** 16142 1727204147.02175: entering _queue_task() for managed-node2/fail 16142 1727204147.02510: worker is 1 (out of 1 available) 16142 1727204147.02523: exiting _queue_task() for managed-node2/fail 16142 1727204147.02535: done queuing things up, now waiting for results queue to drain 16142 1727204147.02537: waiting for pending results... 
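The task queued here (main.yml:25) aborts the play when teaming is requested on EL10 or later; the log below shows its guard ansible_distribution_major_version | int > 9 evaluating to False on this EL9 host, so it is skipped. A sketch under the same caveats as above, with the message and any team-specific condition being assumptions:

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later  # assumed wording
  when:
    - ansible_distribution_major_version | int > 9   # logged condition, False here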
16142 1727204147.03468: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 16142 1727204147.03901: in run() - task 0affcd87-79f5-fddd-f6c7-00000000011f 16142 1727204147.03923: variable 'ansible_search_path' from source: unknown 16142 1727204147.03930: variable 'ansible_search_path' from source: unknown 16142 1727204147.03979: calling self._execute() 16142 1727204147.04160: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204147.04291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204147.04305: variable 'omit' from source: magic vars 16142 1727204147.05038: variable 'ansible_distribution_major_version' from source: facts 16142 1727204147.05172: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204147.05469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204147.10115: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204147.10557: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204147.11301: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204147.11348: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204147.11384: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204147.11477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204147.11513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204147.11544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204147.11593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204147.11617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204147.11727: variable 'ansible_distribution_major_version' from source: facts 16142 1727204147.11751: Evaluated conditional (ansible_distribution_major_version | int > 9): False 16142 1727204147.12101: when evaluation is False, skipping this task 16142 1727204147.12108: _execute() done 16142 1727204147.12115: dumping result to json 16142 1727204147.12122: done dumping result, returning 16142 1727204147.12135: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-fddd-f6c7-00000000011f] 16142 1727204147.12148: sending task result for task 
0affcd87-79f5-fddd-f6c7-00000000011f 16142 1727204147.12271: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000011f 16142 1727204147.12279: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 16142 1727204147.12331: no more pending results, returning what we have 16142 1727204147.12335: results queue empty 16142 1727204147.12335: checking for any_errors_fatal 16142 1727204147.12342: done checking for any_errors_fatal 16142 1727204147.12343: checking for max_fail_percentage 16142 1727204147.12345: done checking for max_fail_percentage 16142 1727204147.12345: checking to see if all hosts have failed and the running result is not ok 16142 1727204147.12346: done checking to see if all hosts have failed 16142 1727204147.12347: getting the remaining hosts for this loop 16142 1727204147.12348: done getting the remaining hosts for this loop 16142 1727204147.12352: getting the next task for host managed-node2 16142 1727204147.12358: done getting next task for host managed-node2 16142 1727204147.12362: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 16142 1727204147.12372: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204147.12391: getting variables 16142 1727204147.12393: in VariableManager get_vars() 16142 1727204147.12447: Calling all_inventory to load vars for managed-node2 16142 1727204147.12450: Calling groups_inventory to load vars for managed-node2 16142 1727204147.12452: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204147.12462: Calling all_plugins_play to load vars for managed-node2 16142 1727204147.12467: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204147.12471: Calling groups_plugins_play to load vars for managed-node2 16142 1727204147.15002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204147.18491: done with get_vars() 16142 1727204147.18534: done getting variables 16142 1727204147.18602: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:55:47 -0400 (0:00:00.164) 0:00:46.363 ***** 16142 1727204147.18647: entering _queue_task() for managed-node2/dnf 16142 1727204147.19343: worker is 1 (out of 1 available) 16142 1727204147.19358: exiting _queue_task() for managed-node2/dnf 16142 1727204147.19371: done queuing things up, now waiting for results queue to drain 16142 1727204147.19372: waiting for pending results... 
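The task queued here (main.yml:36) uses the dnf action and is skipped further down because neither __network_wireless_connections_defined nor __network_team_connections_defined is true for the profiles in network_connections. A rough sketch of the shape such an update check might take; the package variable and the dnf arguments are assumptions, while both when conditions are the ones evaluated in the log:

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"   # hypothetical variable, not shown in this log
    state: latest
  check_mode: true                   # assumption: probe for updates without installing
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7   # logged, True
    - __network_wireless_connections_defined or __network_team_connections_defined       # logged, False here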
16142 1727204147.19694: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 16142 1727204147.19856: in run() - task 0affcd87-79f5-fddd-f6c7-000000000120 16142 1727204147.19881: variable 'ansible_search_path' from source: unknown 16142 1727204147.19888: variable 'ansible_search_path' from source: unknown 16142 1727204147.19935: calling self._execute() 16142 1727204147.20045: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204147.20057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204147.20073: variable 'omit' from source: magic vars 16142 1727204147.20469: variable 'ansible_distribution_major_version' from source: facts 16142 1727204147.20488: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204147.20707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204147.23860: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204147.23944: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204147.23996: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204147.24037: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204147.24070: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204147.24158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204147.24191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204147.24219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204147.24271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204147.24290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204147.24524: variable 'ansible_distribution' from source: facts 16142 1727204147.24599: variable 'ansible_distribution_major_version' from source: facts 16142 1727204147.24622: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 16142 1727204147.24991: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204147.25260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204147.25292: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204147.25347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204147.25443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204147.25470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204147.25518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204147.25552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204147.25588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204147.25634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204147.25659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204147.25709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204147.25737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204147.25776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204147.25824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204147.25842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204147.26026: variable 'network_connections' from source: task vars 16142 1727204147.26044: variable 'port1_profile' from source: play vars 16142 1727204147.26128: variable 'port1_profile' from source: play vars 16142 1727204147.26144: variable 'port2_profile' from source: play vars 16142 1727204147.26215: variable 'port2_profile' from source: play vars 16142 1727204147.26295: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204147.26484: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204147.26531: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204147.26572: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204147.26620: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204147.26681: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204147.26708: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204147.26753: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204147.26791: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204147.26848: variable '__network_team_connections_defined' from source: role '' defaults 16142 1727204147.27131: variable 'network_connections' from source: task vars 16142 1727204147.27142: variable 'port1_profile' from source: play vars 16142 1727204147.27261: variable 'port1_profile' from source: play vars 16142 1727204147.27277: variable 'port2_profile' from source: play vars 16142 1727204147.27345: variable 'port2_profile' from source: play vars 16142 1727204147.27403: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 16142 1727204147.27414: when evaluation is False, skipping this task 16142 1727204147.27420: _execute() done 16142 1727204147.27427: dumping result to json 16142 1727204147.27437: done dumping result, returning 16142 1727204147.27451: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-fddd-f6c7-000000000120] 16142 1727204147.27462: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000120 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 16142 1727204147.27706: no more pending results, returning what we have 16142 1727204147.27711: results queue empty 16142 1727204147.27712: checking for any_errors_fatal 16142 1727204147.27718: done checking for any_errors_fatal 16142 1727204147.27719: checking for max_fail_percentage 16142 1727204147.27721: done checking for max_fail_percentage 16142 1727204147.27722: checking to see if all hosts have failed and the running result is not ok 16142 1727204147.27723: done checking to see if all hosts have failed 16142 1727204147.27724: getting the remaining hosts for this loop 16142 1727204147.27726: done getting the remaining hosts for this loop 16142 1727204147.27730: getting the next task for host 
managed-node2 16142 1727204147.27763: done getting next task for host managed-node2 16142 1727204147.27771: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 16142 1727204147.27774: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204147.27795: getting variables 16142 1727204147.27797: in VariableManager get_vars() 16142 1727204147.27857: Calling all_inventory to load vars for managed-node2 16142 1727204147.27861: Calling groups_inventory to load vars for managed-node2 16142 1727204147.27865: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204147.27876: Calling all_plugins_play to load vars for managed-node2 16142 1727204147.27878: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204147.27881: Calling groups_plugins_play to load vars for managed-node2 16142 1727204147.29139: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000120 16142 1727204147.29145: WORKER PROCESS EXITING 16142 1727204147.31544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204147.33958: done with get_vars() 16142 1727204147.33991: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 16142 1727204147.34069: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:55:47 -0400 (0:00:00.154) 0:00:46.517 ***** 16142 1727204147.34105: entering _queue_task() for managed-node2/yum 16142 1727204147.34442: worker is 1 (out of 1 available) 16142 1727204147.34454: exiting _queue_task() for managed-node2/yum 16142 1727204147.34465: done queuing things up, now waiting for results queue to drain 16142 1727204147.34466: waiting for pending results... 
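The companion task at main.yml:48 performs the same check for hosts that still use YUM; note the logged line "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf", so on this ansible-core 2.17 controller the yum action is served by the dnf plugin. It is skipped below because ansible_distribution_major_version | int < 8 is False on this host. A sketch of the guard, with the module arguments again being assumptions:

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"   # hypothetical variable, as above
    state: latest
  check_mode: true                   # assumption
  when:
    - ansible_distribution_major_version | int < 8   # logged condition, False here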
16142 1727204147.34758: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 16142 1727204147.34918: in run() - task 0affcd87-79f5-fddd-f6c7-000000000121 16142 1727204147.34939: variable 'ansible_search_path' from source: unknown 16142 1727204147.34949: variable 'ansible_search_path' from source: unknown 16142 1727204147.34992: calling self._execute() 16142 1727204147.35109: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204147.35120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204147.35135: variable 'omit' from source: magic vars 16142 1727204147.35559: variable 'ansible_distribution_major_version' from source: facts 16142 1727204147.35584: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204147.35751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204147.38754: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204147.38954: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204147.39003: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204147.39043: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204147.39080: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204147.39222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204147.39323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204147.39355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204147.39406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204147.39426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204147.39548: variable 'ansible_distribution_major_version' from source: facts 16142 1727204147.39571: Evaluated conditional (ansible_distribution_major_version | int < 8): False 16142 1727204147.39579: when evaluation is False, skipping this task 16142 1727204147.39586: _execute() done 16142 1727204147.39597: dumping result to json 16142 1727204147.39605: done dumping result, returning 16142 1727204147.39617: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-fddd-f6c7-000000000121] 16142 
1727204147.39628: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000121 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 16142 1727204147.39791: no more pending results, returning what we have 16142 1727204147.39796: results queue empty 16142 1727204147.39797: checking for any_errors_fatal 16142 1727204147.39803: done checking for any_errors_fatal 16142 1727204147.39804: checking for max_fail_percentage 16142 1727204147.39807: done checking for max_fail_percentage 16142 1727204147.39808: checking to see if all hosts have failed and the running result is not ok 16142 1727204147.39808: done checking to see if all hosts have failed 16142 1727204147.39809: getting the remaining hosts for this loop 16142 1727204147.39811: done getting the remaining hosts for this loop 16142 1727204147.39815: getting the next task for host managed-node2 16142 1727204147.39822: done getting next task for host managed-node2 16142 1727204147.39827: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 16142 1727204147.39830: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204147.39851: getting variables 16142 1727204147.39854: in VariableManager get_vars() 16142 1727204147.39914: Calling all_inventory to load vars for managed-node2 16142 1727204147.39917: Calling groups_inventory to load vars for managed-node2 16142 1727204147.39920: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204147.39930: Calling all_plugins_play to load vars for managed-node2 16142 1727204147.39933: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204147.39936: Calling groups_plugins_play to load vars for managed-node2 16142 1727204147.41104: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000121 16142 1727204147.41108: WORKER PROCESS EXITING 16142 1727204147.41666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204147.43505: done with get_vars() 16142 1727204147.43529: done getting variables 16142 1727204147.43608: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:55:47 -0400 (0:00:00.095) 0:00:46.613 ***** 16142 1727204147.43648: entering _queue_task() for managed-node2/fail 16142 1727204147.43990: worker is 1 (out of 1 available) 16142 1727204147.44004: exiting _queue_task() for managed-node2/fail 16142 1727204147.44015: done queuing things up, now waiting for results queue to drain 16142 1727204147.44016: waiting for pending results... 
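The action module loaded for this step is fail, so "Ask user's consent to restart NetworkManager" is a guarded hard stop rather than an interactive prompt. A minimal sketch of such a guard, using only the condition reported in the skip result below; the msg: text is an assumed placeholder.

    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: Refusing to continue without consent to restart NetworkManager   # assumed wording
      when: __network_wireless_connections_defined or __network_team_connections_defined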
16142 1727204147.44312: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 16142 1727204147.44455: in run() - task 0affcd87-79f5-fddd-f6c7-000000000122 16142 1727204147.44479: variable 'ansible_search_path' from source: unknown 16142 1727204147.44486: variable 'ansible_search_path' from source: unknown 16142 1727204147.44529: calling self._execute() 16142 1727204147.44635: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204147.44648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204147.44667: variable 'omit' from source: magic vars 16142 1727204147.45076: variable 'ansible_distribution_major_version' from source: facts 16142 1727204147.45095: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204147.45230: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204147.45439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204147.47828: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204147.47911: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204147.47956: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204147.47993: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204147.48023: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204147.48113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204147.48155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204147.48191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204147.48238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204147.48267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204147.48319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204147.48349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204147.48385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204147.48433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204147.48455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204147.48504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204147.48533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204147.48566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204147.48615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204147.48635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204147.48836: variable 'network_connections' from source: task vars 16142 1727204147.48854: variable 'port1_profile' from source: play vars 16142 1727204147.48936: variable 'port1_profile' from source: play vars 16142 1727204147.48953: variable 'port2_profile' from source: play vars 16142 1727204147.49027: variable 'port2_profile' from source: play vars 16142 1727204147.49104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204147.49303: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204147.49351: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204147.49388: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204147.49421: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204147.49479: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204147.49507: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204147.49536: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204147.49576: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 
1727204147.49633: variable '__network_team_connections_defined' from source: role '' defaults 16142 1727204147.49892: variable 'network_connections' from source: task vars 16142 1727204147.49902: variable 'port1_profile' from source: play vars 16142 1727204147.49969: variable 'port1_profile' from source: play vars 16142 1727204147.49984: variable 'port2_profile' from source: play vars 16142 1727204147.50049: variable 'port2_profile' from source: play vars 16142 1727204147.50081: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 16142 1727204147.50090: when evaluation is False, skipping this task 16142 1727204147.50102: _execute() done 16142 1727204147.50111: dumping result to json 16142 1727204147.50120: done dumping result, returning 16142 1727204147.50133: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-fddd-f6c7-000000000122] 16142 1727204147.50152: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000122 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 16142 1727204147.50322: no more pending results, returning what we have 16142 1727204147.50326: results queue empty 16142 1727204147.50327: checking for any_errors_fatal 16142 1727204147.50334: done checking for any_errors_fatal 16142 1727204147.50335: checking for max_fail_percentage 16142 1727204147.50337: done checking for max_fail_percentage 16142 1727204147.50338: checking to see if all hosts have failed and the running result is not ok 16142 1727204147.50339: done checking to see if all hosts have failed 16142 1727204147.50339: getting the remaining hosts for this loop 16142 1727204147.50341: done getting the remaining hosts for this loop 16142 1727204147.50345: getting the next task for host managed-node2 16142 1727204147.50353: done getting next task for host managed-node2 16142 1727204147.50358: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 16142 1727204147.50361: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204147.50385: getting variables 16142 1727204147.50387: in VariableManager get_vars() 16142 1727204147.50446: Calling all_inventory to load vars for managed-node2 16142 1727204147.50449: Calling groups_inventory to load vars for managed-node2 16142 1727204147.50452: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204147.50465: Calling all_plugins_play to load vars for managed-node2 16142 1727204147.50468: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204147.50471: Calling groups_plugins_play to load vars for managed-node2 16142 1727204147.51985: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000122 16142 1727204147.51989: WORKER PROCESS EXITING 16142 1727204147.52224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204147.53849: done with get_vars() 16142 1727204147.53880: done getting variables 16142 1727204147.53939: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:55:47 -0400 (0:00:00.103) 0:00:46.716 ***** 16142 1727204147.53978: entering _queue_task() for managed-node2/package 16142 1727204147.54311: worker is 1 (out of 1 available) 16142 1727204147.54323: exiting _queue_task() for managed-node2/package 16142 1727204147.54334: done queuing things up, now waiting for results queue to drain 16142 1727204147.54336: waiting for pending results... 
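The skip reported a few entries further on hinges on the Jinja subset test: the role only installs when network_packages is not already a subset of the keys in ansible_facts.packages (the package inventory gathered earlier in the run). A minimal sketch of that pattern; the state: value is assumed, since only the package action and the guard appear in the log.

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present                          # assumed
      when: not network_packages is subset(ansible_facts.packages.keys())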
16142 1727204147.54634: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 16142 1727204147.54788: in run() - task 0affcd87-79f5-fddd-f6c7-000000000123 16142 1727204147.54807: variable 'ansible_search_path' from source: unknown 16142 1727204147.54817: variable 'ansible_search_path' from source: unknown 16142 1727204147.54859: calling self._execute() 16142 1727204147.54962: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204147.54976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204147.54990: variable 'omit' from source: magic vars 16142 1727204147.55386: variable 'ansible_distribution_major_version' from source: facts 16142 1727204147.55405: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204147.55616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204147.55894: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204147.55946: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204147.55992: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204147.56073: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204147.56192: variable 'network_packages' from source: role '' defaults 16142 1727204147.56309: variable '__network_provider_setup' from source: role '' defaults 16142 1727204147.56323: variable '__network_service_name_default_nm' from source: role '' defaults 16142 1727204147.56384: variable '__network_service_name_default_nm' from source: role '' defaults 16142 1727204147.56396: variable '__network_packages_default_nm' from source: role '' defaults 16142 1727204147.56457: variable '__network_packages_default_nm' from source: role '' defaults 16142 1727204147.56656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204147.58840: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204147.58918: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204147.58959: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204147.58998: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204147.59031: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204147.59447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204147.59486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204147.59516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204147.59565: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204147.59586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204147.59631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204147.59654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204147.59685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204147.59726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204147.59744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204147.59983: variable '__network_packages_default_gobject_packages' from source: role '' defaults 16142 1727204147.60108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204147.60137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204147.60169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204147.60217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204147.60237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204147.60333: variable 'ansible_python' from source: facts 16142 1727204147.60368: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 16142 1727204147.60462: variable '__network_wpa_supplicant_required' from source: role '' defaults 16142 1727204147.60551: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 16142 1727204147.60688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204147.60717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 16142 1727204147.60747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204147.60797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204147.60818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204147.60875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204147.60910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204147.60940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204147.60989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204147.61008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204147.61165: variable 'network_connections' from source: task vars 16142 1727204147.61176: variable 'port1_profile' from source: play vars 16142 1727204147.61288: variable 'port1_profile' from source: play vars 16142 1727204147.61309: variable 'port2_profile' from source: play vars 16142 1727204147.61418: variable 'port2_profile' from source: play vars 16142 1727204147.61493: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204147.61529: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204147.61568: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204147.61604: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204147.61661: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204147.61940: variable 'network_connections' from source: task vars 16142 1727204147.61953: variable 'port1_profile' from source: play vars 16142 1727204147.62061: variable 'port1_profile' from source: play vars 16142 1727204147.62079: variable 'port2_profile' from source: play vars 16142 1727204147.62184: variable 'port2_profile' from source: play vars 16142 
1727204147.62220: variable '__network_packages_default_wireless' from source: role '' defaults 16142 1727204147.62304: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204147.62636: variable 'network_connections' from source: task vars 16142 1727204147.62647: variable 'port1_profile' from source: play vars 16142 1727204147.62719: variable 'port1_profile' from source: play vars 16142 1727204147.62730: variable 'port2_profile' from source: play vars 16142 1727204147.62798: variable 'port2_profile' from source: play vars 16142 1727204147.62832: variable '__network_packages_default_team' from source: role '' defaults 16142 1727204147.62915: variable '__network_team_connections_defined' from source: role '' defaults 16142 1727204147.63227: variable 'network_connections' from source: task vars 16142 1727204147.63236: variable 'port1_profile' from source: play vars 16142 1727204147.63301: variable 'port1_profile' from source: play vars 16142 1727204147.63315: variable 'port2_profile' from source: play vars 16142 1727204147.63391: variable 'port2_profile' from source: play vars 16142 1727204147.63450: variable '__network_service_name_default_initscripts' from source: role '' defaults 16142 1727204147.63524: variable '__network_service_name_default_initscripts' from source: role '' defaults 16142 1727204147.63538: variable '__network_packages_default_initscripts' from source: role '' defaults 16142 1727204147.63608: variable '__network_packages_default_initscripts' from source: role '' defaults 16142 1727204147.63834: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 16142 1727204147.64360: variable 'network_connections' from source: task vars 16142 1727204147.64373: variable 'port1_profile' from source: play vars 16142 1727204147.64436: variable 'port1_profile' from source: play vars 16142 1727204147.64454: variable 'port2_profile' from source: play vars 16142 1727204147.64518: variable 'port2_profile' from source: play vars 16142 1727204147.64530: variable 'ansible_distribution' from source: facts 16142 1727204147.64539: variable '__network_rh_distros' from source: role '' defaults 16142 1727204147.64549: variable 'ansible_distribution_major_version' from source: facts 16142 1727204147.64577: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 16142 1727204147.64748: variable 'ansible_distribution' from source: facts 16142 1727204147.64758: variable '__network_rh_distros' from source: role '' defaults 16142 1727204147.64772: variable 'ansible_distribution_major_version' from source: facts 16142 1727204147.64790: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 16142 1727204147.64957: variable 'ansible_distribution' from source: facts 16142 1727204147.64968: variable '__network_rh_distros' from source: role '' defaults 16142 1727204147.64978: variable 'ansible_distribution_major_version' from source: facts 16142 1727204147.65023: variable 'network_provider' from source: set_fact 16142 1727204147.65046: variable 'ansible_facts' from source: unknown 16142 1727204147.65789: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 16142 1727204147.65797: when evaluation is False, skipping this task 16142 1727204147.65803: _execute() done 16142 1727204147.65808: dumping result to json 16142 1727204147.65814: done dumping result, returning 16142 1727204147.65825: done running TaskExecutor() for 
managed-node2/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-fddd-f6c7-000000000123] 16142 1727204147.65833: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000123 16142 1727204147.65940: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000123 16142 1727204147.65946: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 16142 1727204147.66006: no more pending results, returning what we have 16142 1727204147.66010: results queue empty 16142 1727204147.66011: checking for any_errors_fatal 16142 1727204147.66018: done checking for any_errors_fatal 16142 1727204147.66019: checking for max_fail_percentage 16142 1727204147.66021: done checking for max_fail_percentage 16142 1727204147.66022: checking to see if all hosts have failed and the running result is not ok 16142 1727204147.66022: done checking to see if all hosts have failed 16142 1727204147.66023: getting the remaining hosts for this loop 16142 1727204147.66024: done getting the remaining hosts for this loop 16142 1727204147.66028: getting the next task for host managed-node2 16142 1727204147.66035: done getting next task for host managed-node2 16142 1727204147.66039: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 16142 1727204147.66041: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204147.66068: getting variables 16142 1727204147.66070: in VariableManager get_vars() 16142 1727204147.66125: Calling all_inventory to load vars for managed-node2 16142 1727204147.66127: Calling groups_inventory to load vars for managed-node2 16142 1727204147.66129: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204147.66139: Calling all_plugins_play to load vars for managed-node2 16142 1727204147.66142: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204147.66145: Calling groups_plugins_play to load vars for managed-node2 16142 1727204147.68043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204147.69650: done with get_vars() 16142 1727204147.69680: done getting variables 16142 1727204147.69741: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:55:47 -0400 (0:00:00.157) 0:00:46.874 ***** 16142 1727204147.69780: entering _queue_task() for managed-node2/package 16142 1727204147.70108: worker is 1 (out of 1 available) 16142 1727204147.70121: exiting _queue_task() for managed-node2/package 16142 1727204147.70133: done queuing things up, now waiting for results queue to drain 16142 1727204147.70135: waiting for pending results... 
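network_state defaults to an empty dict in the role, so this task, and the python3-libnmstate task after it, only fire when the caller supplies a non-empty network_state. A sketch of the guard evaluated below; the package list is assumed from the task name.

    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager                      # assumed from the task name
          - nmstate
        state: present
      when: network_state != {}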
16142 1727204147.70432: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 16142 1727204147.70583: in run() - task 0affcd87-79f5-fddd-f6c7-000000000124 16142 1727204147.70603: variable 'ansible_search_path' from source: unknown 16142 1727204147.70610: variable 'ansible_search_path' from source: unknown 16142 1727204147.70650: calling self._execute() 16142 1727204147.70751: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204147.70762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204147.70779: variable 'omit' from source: magic vars 16142 1727204147.71157: variable 'ansible_distribution_major_version' from source: facts 16142 1727204147.71176: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204147.71310: variable 'network_state' from source: role '' defaults 16142 1727204147.71328: Evaluated conditional (network_state != {}): False 16142 1727204147.71337: when evaluation is False, skipping this task 16142 1727204147.71347: _execute() done 16142 1727204147.71355: dumping result to json 16142 1727204147.71362: done dumping result, returning 16142 1727204147.71376: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-fddd-f6c7-000000000124] 16142 1727204147.71387: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000124 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204147.71554: no more pending results, returning what we have 16142 1727204147.71559: results queue empty 16142 1727204147.71560: checking for any_errors_fatal 16142 1727204147.71569: done checking for any_errors_fatal 16142 1727204147.71570: checking for max_fail_percentage 16142 1727204147.71573: done checking for max_fail_percentage 16142 1727204147.71574: checking to see if all hosts have failed and the running result is not ok 16142 1727204147.71575: done checking to see if all hosts have failed 16142 1727204147.71576: getting the remaining hosts for this loop 16142 1727204147.71577: done getting the remaining hosts for this loop 16142 1727204147.71582: getting the next task for host managed-node2 16142 1727204147.71590: done getting next task for host managed-node2 16142 1727204147.71595: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 16142 1727204147.71598: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204147.71623: getting variables 16142 1727204147.71625: in VariableManager get_vars() 16142 1727204147.71688: Calling all_inventory to load vars for managed-node2 16142 1727204147.71691: Calling groups_inventory to load vars for managed-node2 16142 1727204147.71693: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204147.71706: Calling all_plugins_play to load vars for managed-node2 16142 1727204147.71709: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204147.71712: Calling groups_plugins_play to load vars for managed-node2 16142 1727204147.72684: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000124 16142 1727204147.72687: WORKER PROCESS EXITING 16142 1727204147.73440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204147.75123: done with get_vars() 16142 1727204147.75150: done getting variables 16142 1727204147.75215: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:55:47 -0400 (0:00:00.054) 0:00:46.929 ***** 16142 1727204147.75252: entering _queue_task() for managed-node2/package 16142 1727204147.75587: worker is 1 (out of 1 available) 16142 1727204147.75601: exiting _queue_task() for managed-node2/package 16142 1727204147.75615: done queuing things up, now waiting for results queue to drain 16142 1727204147.75616: waiting for pending results... 
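For contrast with the skips above, a caller-side sketch of the kind of non-empty network_state value that would make both nmstate install tasks run; the host, interface name, and settings here are purely illustrative and not taken from this run.

    - hosts: managed-node2
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_state:
              interfaces:
                - name: eth1                    # illustrative interface
                  type: ethernet
                  state: up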
16142 1727204147.75927: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 16142 1727204147.76078: in run() - task 0affcd87-79f5-fddd-f6c7-000000000125 16142 1727204147.76099: variable 'ansible_search_path' from source: unknown 16142 1727204147.76108: variable 'ansible_search_path' from source: unknown 16142 1727204147.76150: calling self._execute() 16142 1727204147.76256: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204147.76270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204147.76288: variable 'omit' from source: magic vars 16142 1727204147.76659: variable 'ansible_distribution_major_version' from source: facts 16142 1727204147.76678: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204147.76801: variable 'network_state' from source: role '' defaults 16142 1727204147.76820: Evaluated conditional (network_state != {}): False 16142 1727204147.76828: when evaluation is False, skipping this task 16142 1727204147.76835: _execute() done 16142 1727204147.76842: dumping result to json 16142 1727204147.76849: done dumping result, returning 16142 1727204147.76860: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-fddd-f6c7-000000000125] 16142 1727204147.76876: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000125 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204147.77029: no more pending results, returning what we have 16142 1727204147.77033: results queue empty 16142 1727204147.77035: checking for any_errors_fatal 16142 1727204147.77041: done checking for any_errors_fatal 16142 1727204147.77042: checking for max_fail_percentage 16142 1727204147.77044: done checking for max_fail_percentage 16142 1727204147.77045: checking to see if all hosts have failed and the running result is not ok 16142 1727204147.77046: done checking to see if all hosts have failed 16142 1727204147.77047: getting the remaining hosts for this loop 16142 1727204147.77048: done getting the remaining hosts for this loop 16142 1727204147.77052: getting the next task for host managed-node2 16142 1727204147.77060: done getting next task for host managed-node2 16142 1727204147.77068: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 16142 1727204147.77071: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204147.77093: getting variables 16142 1727204147.77095: in VariableManager get_vars() 16142 1727204147.77150: Calling all_inventory to load vars for managed-node2 16142 1727204147.77154: Calling groups_inventory to load vars for managed-node2 16142 1727204147.77156: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204147.77171: Calling all_plugins_play to load vars for managed-node2 16142 1727204147.77174: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204147.77178: Calling groups_plugins_play to load vars for managed-node2 16142 1727204147.78185: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000125 16142 1727204147.78189: WORKER PROCESS EXITING 16142 1727204147.79067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204147.80678: done with get_vars() 16142 1727204147.80708: done getting variables 16142 1727204147.80767: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:55:47 -0400 (0:00:00.055) 0:00:46.984 ***** 16142 1727204147.80803: entering _queue_task() for managed-node2/service 16142 1727204147.81133: worker is 1 (out of 1 available) 16142 1727204147.81147: exiting _queue_task() for managed-node2/service 16142 1727204147.81159: done queuing things up, now waiting for results queue to drain 16142 1727204147.81161: waiting for pending results... 
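The action module loaded here is service, so the restart is an ordinary service task behind the same wireless/team guard that has been skipping throughout this run. A minimal sketch; the service name is assumed from the task name.

    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager                    # assumed from the task name
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined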
16142 1727204147.81471: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 16142 1727204147.81625: in run() - task 0affcd87-79f5-fddd-f6c7-000000000126 16142 1727204147.81646: variable 'ansible_search_path' from source: unknown 16142 1727204147.81655: variable 'ansible_search_path' from source: unknown 16142 1727204147.81698: calling self._execute() 16142 1727204147.81805: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204147.81823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204147.81838: variable 'omit' from source: magic vars 16142 1727204147.82222: variable 'ansible_distribution_major_version' from source: facts 16142 1727204147.82241: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204147.82375: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204147.82587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204147.84995: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204147.85090: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204147.85137: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204147.85185: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204147.85220: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204147.85314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204147.85351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204147.85386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204147.85440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204147.85459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204147.85518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204147.85549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204147.85581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 16142 1727204147.85627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204147.85645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204147.85690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204147.85718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204147.85748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204147.85792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204147.85811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204147.85988: variable 'network_connections' from source: task vars 16142 1727204147.86005: variable 'port1_profile' from source: play vars 16142 1727204147.86083: variable 'port1_profile' from source: play vars 16142 1727204147.86098: variable 'port2_profile' from source: play vars 16142 1727204147.86162: variable 'port2_profile' from source: play vars 16142 1727204147.86233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204147.86407: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204147.86448: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204147.86485: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204147.86518: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204147.86563: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204147.86592: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204147.86623: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204147.86652: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204147.86711: variable '__network_team_connections_defined' from source: 
role '' defaults 16142 1727204147.86939: variable 'network_connections' from source: task vars 16142 1727204147.86949: variable 'port1_profile' from source: play vars 16142 1727204147.87013: variable 'port1_profile' from source: play vars 16142 1727204147.87025: variable 'port2_profile' from source: play vars 16142 1727204147.87093: variable 'port2_profile' from source: play vars 16142 1727204147.87122: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 16142 1727204147.87130: when evaluation is False, skipping this task 16142 1727204147.87136: _execute() done 16142 1727204147.87147: dumping result to json 16142 1727204147.87155: done dumping result, returning 16142 1727204147.87168: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-fddd-f6c7-000000000126] 16142 1727204147.87187: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000126 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 16142 1727204147.87343: no more pending results, returning what we have 16142 1727204147.87347: results queue empty 16142 1727204147.87348: checking for any_errors_fatal 16142 1727204147.87355: done checking for any_errors_fatal 16142 1727204147.87356: checking for max_fail_percentage 16142 1727204147.87359: done checking for max_fail_percentage 16142 1727204147.87360: checking to see if all hosts have failed and the running result is not ok 16142 1727204147.87360: done checking to see if all hosts have failed 16142 1727204147.87361: getting the remaining hosts for this loop 16142 1727204147.87363: done getting the remaining hosts for this loop 16142 1727204147.87368: getting the next task for host managed-node2 16142 1727204147.87376: done getting next task for host managed-node2 16142 1727204147.87381: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 16142 1727204147.87384: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204147.87406: getting variables 16142 1727204147.87408: in VariableManager get_vars() 16142 1727204147.87463: Calling all_inventory to load vars for managed-node2 16142 1727204147.87468: Calling groups_inventory to load vars for managed-node2 16142 1727204147.87471: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204147.87481: Calling all_plugins_play to load vars for managed-node2 16142 1727204147.87484: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204147.87487: Calling groups_plugins_play to load vars for managed-node2 16142 1727204147.88484: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000126 16142 1727204147.88488: WORKER PROCESS EXITING 16142 1727204147.89229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204147.91687: done with get_vars() 16142 1727204147.91722: done getting variables 16142 1727204147.91788: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:55:47 -0400 (0:00:00.110) 0:00:47.095 ***** 16142 1727204147.91823: entering _queue_task() for managed-node2/service 16142 1727204147.92182: worker is 1 (out of 1 available) 16142 1727204147.92196: exiting _queue_task() for managed-node2/service 16142 1727204147.92210: done queuing things up, now waiting for results queue to drain 16142 1727204147.92211: waiting for pending results... 
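The entries above show the task "Restart NetworkManager due to wireless or team interfaces" being skipped on managed-node2 because its when: expression (__network_wireless_connections_defined or __network_team_connections_defined) evaluated to False, and the next task, "Enable and start NetworkManager" (the service action at roles/network/tasks/main.yml:122), being queued. A minimal sketch of how such a when: expression is truth-tested against the collected variables, assuming plain Jinja2 rendering rather than Ansible's actual templating internals, with the two flag values taken from the skip result above:

# Minimal sketch, not Ansible's real conditional evaluator.
# The two variable values below are the ones implied by the skipped task above.
from jinja2 import Environment

task_vars = {
    "__network_wireless_connections_defined": False,
    "__network_team_connections_defined": False,
}

conditional = "__network_wireless_connections_defined or __network_team_connections_defined"
rendered = Environment().from_string("{{ " + conditional + " }}").render(**task_vars)

# Mirrors the log entries "Evaluated conditional (...): False" and
# "when evaluation is False, skipping this task".
print(f"Evaluated conditional ({conditional}): {rendered == 'True'}")

The same mechanism produces the later entry "Evaluated conditional (network_provider == "nm" or network_state != {}): True", which is why the service task that follows proceeds instead of being skipped.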
16142 1727204147.92511: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 16142 1727204147.92651: in run() - task 0affcd87-79f5-fddd-f6c7-000000000127 16142 1727204147.92679: variable 'ansible_search_path' from source: unknown 16142 1727204147.92687: variable 'ansible_search_path' from source: unknown 16142 1727204147.92728: calling self._execute() 16142 1727204147.92833: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204147.92843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204147.92857: variable 'omit' from source: magic vars 16142 1727204147.93319: variable 'ansible_distribution_major_version' from source: facts 16142 1727204147.93373: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204147.93576: variable 'network_provider' from source: set_fact 16142 1727204147.93586: variable 'network_state' from source: role '' defaults 16142 1727204147.93600: Evaluated conditional (network_provider == "nm" or network_state != {}): True 16142 1727204147.93611: variable 'omit' from source: magic vars 16142 1727204147.93676: variable 'omit' from source: magic vars 16142 1727204147.93714: variable 'network_service_name' from source: role '' defaults 16142 1727204147.93786: variable 'network_service_name' from source: role '' defaults 16142 1727204147.93893: variable '__network_provider_setup' from source: role '' defaults 16142 1727204147.93910: variable '__network_service_name_default_nm' from source: role '' defaults 16142 1727204147.93978: variable '__network_service_name_default_nm' from source: role '' defaults 16142 1727204147.93993: variable '__network_packages_default_nm' from source: role '' defaults 16142 1727204147.94064: variable '__network_packages_default_nm' from source: role '' defaults 16142 1727204147.94303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204148.00332: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204148.00480: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204148.00639: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204148.00670: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204148.00694: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204148.00855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204148.00889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204148.00912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204148.00952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 16142 1727204148.00966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204148.01009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204148.01030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204148.01054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204148.01092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204148.01105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204148.01341: variable '__network_packages_default_gobject_packages' from source: role '' defaults 16142 1727204148.01455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204148.01659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204148.01685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204148.01723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204148.01739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204148.01838: variable 'ansible_python' from source: facts 16142 1727204148.01859: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 16142 1727204148.01944: variable '__network_wpa_supplicant_required' from source: role '' defaults 16142 1727204148.02028: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 16142 1727204148.02150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204148.02176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204148.02200: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204148.02241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204148.02253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204148.02524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204148.02549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204148.02576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204148.02615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204148.02628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204148.02769: variable 'network_connections' from source: task vars 16142 1727204148.02780: variable 'port1_profile' from source: play vars 16142 1727204148.02854: variable 'port1_profile' from source: play vars 16142 1727204148.02868: variable 'port2_profile' from source: play vars 16142 1727204148.02940: variable 'port2_profile' from source: play vars 16142 1727204148.03197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204148.03397: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204148.03445: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204148.03489: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204148.03529: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204148.03594: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204148.03624: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204148.03656: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204148.03691: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204148.03739: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204148.04017: variable 'network_connections' from source: task vars 16142 1727204148.04024: variable 'port1_profile' from source: play vars 16142 1727204148.04097: variable 'port1_profile' from source: play vars 16142 1727204148.04109: variable 'port2_profile' from source: play vars 16142 1727204148.04329: variable 'port2_profile' from source: play vars 16142 1727204148.04360: variable '__network_packages_default_wireless' from source: role '' defaults 16142 1727204148.04441: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204148.04742: variable 'network_connections' from source: task vars 16142 1727204148.04745: variable 'port1_profile' from source: play vars 16142 1727204148.04818: variable 'port1_profile' from source: play vars 16142 1727204148.04826: variable 'port2_profile' from source: play vars 16142 1727204148.04917: variable 'port2_profile' from source: play vars 16142 1727204148.04920: variable '__network_packages_default_team' from source: role '' defaults 16142 1727204148.05023: variable '__network_team_connections_defined' from source: role '' defaults 16142 1727204148.05353: variable 'network_connections' from source: task vars 16142 1727204148.05357: variable 'port1_profile' from source: play vars 16142 1727204148.05429: variable 'port1_profile' from source: play vars 16142 1727204148.05438: variable 'port2_profile' from source: play vars 16142 1727204148.05507: variable 'port2_profile' from source: play vars 16142 1727204148.05560: variable '__network_service_name_default_initscripts' from source: role '' defaults 16142 1727204148.05622: variable '__network_service_name_default_initscripts' from source: role '' defaults 16142 1727204148.05628: variable '__network_packages_default_initscripts' from source: role '' defaults 16142 1727204148.05690: variable '__network_packages_default_initscripts' from source: role '' defaults 16142 1727204148.05905: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 16142 1727204148.06647: variable 'network_connections' from source: task vars 16142 1727204148.06650: variable 'port1_profile' from source: play vars 16142 1727204148.06652: variable 'port1_profile' from source: play vars 16142 1727204148.06654: variable 'port2_profile' from source: play vars 16142 1727204148.06677: variable 'port2_profile' from source: play vars 16142 1727204148.06684: variable 'ansible_distribution' from source: facts 16142 1727204148.06687: variable '__network_rh_distros' from source: role '' defaults 16142 1727204148.06693: variable 'ansible_distribution_major_version' from source: facts 16142 1727204148.06707: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 16142 1727204148.06881: variable 'ansible_distribution' from source: facts 16142 1727204148.06884: variable '__network_rh_distros' from source: role '' defaults 16142 1727204148.06887: variable 'ansible_distribution_major_version' from source: facts 16142 1727204148.06899: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 16142 1727204148.07163: variable 'ansible_distribution' from source: facts 16142 1727204148.07169: variable '__network_rh_distros' from source: role '' defaults 16142 1727204148.07478: variable 
'ansible_distribution_major_version' from source: facts 16142 1727204148.07516: variable 'network_provider' from source: set_fact 16142 1727204148.07541: variable 'omit' from source: magic vars 16142 1727204148.07568: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204148.07597: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204148.07616: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204148.07632: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204148.07641: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204148.07673: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204148.07676: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204148.07678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204148.07778: Set connection var ansible_timeout to 10 16142 1727204148.07781: Set connection var ansible_connection to ssh 16142 1727204148.07783: Set connection var ansible_shell_type to sh 16142 1727204148.07790: Set connection var ansible_shell_executable to /bin/sh 16142 1727204148.07795: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204148.07803: Set connection var ansible_pipelining to False 16142 1727204148.07827: variable 'ansible_shell_executable' from source: unknown 16142 1727204148.07831: variable 'ansible_connection' from source: unknown 16142 1727204148.07833: variable 'ansible_module_compression' from source: unknown 16142 1727204148.07838: variable 'ansible_shell_type' from source: unknown 16142 1727204148.07845: variable 'ansible_shell_executable' from source: unknown 16142 1727204148.07847: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204148.07849: variable 'ansible_pipelining' from source: unknown 16142 1727204148.07851: variable 'ansible_timeout' from source: unknown 16142 1727204148.07855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204148.08214: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204148.08224: variable 'omit' from source: magic vars 16142 1727204148.08230: starting attempt loop 16142 1727204148.08234: running the handler 16142 1727204148.08312: variable 'ansible_facts' from source: unknown 16142 1727204148.09081: _low_level_execute_command(): starting 16142 1727204148.09087: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204148.09823: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204148.09834: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204148.09845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204148.09859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204148.09900: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204148.09907: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204148.09918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204148.09930: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204148.09939: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204148.09942: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204148.09950: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204148.09959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204148.09975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204148.10297: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204148.10304: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204148.10313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204148.10388: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204148.10406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204148.10419: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204148.10489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204148.12158: stdout chunk (state=3): >>>/root <<< 16142 1727204148.12341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204148.12345: stdout chunk (state=3): >>><<< 16142 1727204148.12352: stderr chunk (state=3): >>><<< 16142 1727204148.12373: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204148.12386: _low_level_execute_command(): starting 16142 1727204148.12393: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204148.1237454-19590-199070008498637 `" && echo ansible-tmp-1727204148.1237454-19590-199070008498637="` echo 
/root/.ansible/tmp/ansible-tmp-1727204148.1237454-19590-199070008498637 `" ) && sleep 0' 16142 1727204148.13066: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204148.13075: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204148.13086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204148.13099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204148.13142: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204148.13149: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204148.13159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204148.13175: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204148.13182: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204148.13188: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204148.13195: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204148.13204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204148.13216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204148.13224: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204148.13232: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204148.13246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204148.13320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204148.13333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204148.13343: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204148.13404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204148.15252: stdout chunk (state=3): >>>ansible-tmp-1727204148.1237454-19590-199070008498637=/root/.ansible/tmp/ansible-tmp-1727204148.1237454-19590-199070008498637 <<< 16142 1727204148.15437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204148.15442: stdout chunk (state=3): >>><<< 16142 1727204148.15444: stderr chunk (state=3): >>><<< 16142 1727204148.15462: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204148.1237454-19590-199070008498637=/root/.ansible/tmp/ansible-tmp-1727204148.1237454-19590-199070008498637 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204148.15497: variable 'ansible_module_compression' from source: unknown 16142 1727204148.15552: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 16142 1727204148.15609: variable 'ansible_facts' from source: unknown 16142 1727204148.15884: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204148.1237454-19590-199070008498637/AnsiballZ_systemd.py 16142 1727204148.16211: Sending initial data 16142 1727204148.16214: Sent initial data (156 bytes) 16142 1727204148.19095: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204148.19107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204148.19123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204148.19140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204148.19255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204148.19273: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204148.19290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204148.19310: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204148.19323: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204148.19340: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204148.19357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204148.19375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204148.19392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204148.19405: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204148.19418: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204148.19432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204148.19531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204148.19589: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204148.19603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204148.19668: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204148.21404: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 
1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204148.21439: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204148.21478: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmphw493sgm /root/.ansible/tmp/ansible-tmp-1727204148.1237454-19590-199070008498637/AnsiballZ_systemd.py <<< 16142 1727204148.21532: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204148.24600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204148.24682: stderr chunk (state=3): >>><<< 16142 1727204148.24686: stdout chunk (state=3): >>><<< 16142 1727204148.24706: done transferring module to remote 16142 1727204148.24720: _low_level_execute_command(): starting 16142 1727204148.24725: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204148.1237454-19590-199070008498637/ /root/.ansible/tmp/ansible-tmp-1727204148.1237454-19590-199070008498637/AnsiballZ_systemd.py && sleep 0' 16142 1727204148.26353: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204148.26357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204148.26370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204148.26385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204148.26427: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204148.26438: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204148.26448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204148.26467: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204148.26476: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204148.26483: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204148.26491: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204148.26500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204148.26511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204148.26519: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204148.26525: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204148.26539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204148.26718: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204148.26732: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204148.26744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204148.26881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 16142 1727204148.28682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204148.28732: stderr chunk (state=3): >>><<< 16142 1727204148.28738: stdout chunk (state=3): >>><<< 16142 1727204148.28750: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204148.28758: _low_level_execute_command(): starting 16142 1727204148.28761: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204148.1237454-19590-199070008498637/AnsiballZ_systemd.py && sleep 0' 16142 1727204148.30343: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204148.30477: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204148.30487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204148.30501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204148.30560: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204148.30571: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204148.30585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204148.30598: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204148.30605: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204148.30611: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204148.30619: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204148.30692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204148.30703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204148.30711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204148.30718: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204148.30727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204148.30915: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 16142 1727204148.30930: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204148.30933: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204148.31126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204148.56259: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 16142 1727204148.56316: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "6905856", "MemoryAvailable": "infinity", "CPUUsageNSec": "1119708000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": 
"infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdo<<< 16142 1727204148.56354: stdout chunk (state=3): >>>gSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": 
"systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 16142 1727204148.57889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204148.57894: stdout chunk (state=3): >>><<< 16142 1727204148.57898: stderr chunk (state=3): >>><<< 16142 1727204148.57917: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "6905856", "MemoryAvailable": "infinity", "CPUUsageNSec": "1119708000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": 
"infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", 
"ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
16142 1727204148.58096: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204148.1237454-19590-199070008498637/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204148.58114: _low_level_execute_command(): starting 16142 1727204148.58117: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204148.1237454-19590-199070008498637/ > /dev/null 2>&1 && sleep 0' 16142 1727204148.59319: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204148.59324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204148.59375: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204148.59379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204148.59393: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 16142 1727204148.59398: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204148.59404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204148.59415: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204148.59421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204148.59502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204148.59516: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204148.59522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204148.59594: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204148.61459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204148.61468: stderr chunk (state=3): >>><<< 16142 1727204148.61472: stdout chunk (state=3): >>><<< 16142 1727204148.61773: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204148.61776: handler run complete 16142 1727204148.61778: attempt loop complete, returning result 16142 1727204148.61780: _execute() done 16142 1727204148.61781: dumping result to json 16142 1727204148.61783: done dumping result, returning 16142 1727204148.61785: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-fddd-f6c7-000000000127] 16142 1727204148.61786: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000127 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204148.62462: no more pending results, returning what we have 16142 1727204148.62468: results queue empty 16142 1727204148.62469: checking for any_errors_fatal 16142 1727204148.62472: done checking for any_errors_fatal 16142 1727204148.62473: checking for max_fail_percentage 16142 1727204148.62474: done checking for max_fail_percentage 16142 1727204148.62475: checking to see if all hosts have failed and the running result is not ok 16142 1727204148.62476: done checking to see if all hosts have failed 16142 1727204148.62477: getting the remaining hosts for this loop 16142 1727204148.62478: done getting the remaining hosts for this loop 16142 1727204148.62481: getting the next task for host managed-node2 16142 1727204148.62486: done getting next task for host managed-node2 16142 1727204148.62491: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 16142 1727204148.62493: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204148.62503: getting variables 16142 1727204148.62505: in VariableManager get_vars() 16142 1727204148.62556: Calling all_inventory to load vars for managed-node2 16142 1727204148.62559: Calling groups_inventory to load vars for managed-node2 16142 1727204148.62561: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204148.62575: Calling all_plugins_play to load vars for managed-node2 16142 1727204148.62578: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204148.62581: Calling groups_plugins_play to load vars for managed-node2 16142 1727204148.63447: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000127 16142 1727204148.63451: WORKER PROCESS EXITING 16142 1727204148.65526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204148.69134: done with get_vars() 16142 1727204148.69203: done getting variables 16142 1727204148.69328: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:55:48 -0400 (0:00:00.775) 0:00:47.870 ***** 16142 1727204148.69366: entering _queue_task() for managed-node2/service 16142 1727204148.69846: worker is 1 (out of 1 available) 16142 1727204148.69858: exiting _queue_task() for managed-node2/service 16142 1727204148.69873: done queuing things up, now waiting for results queue to drain 16142 1727204148.69875: waiting for pending results... 
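The "censored" body printed for the NetworkManager result just above is the placeholder Ansible substitutes whenever a task runs with no_log (the logged invocation carries '_ansible_no_log': True). A minimal, hypothetical task showing the keyword; the command here is made up purely for illustration:

```yaml
- name: Task whose output is hidden in callbacks and logs
  ansible.builtin.command: /usr/bin/true   # hypothetical stand-in for anything sensitive
  no_log: true        # the callback prints the "censored" placeholder instead of the result
  changed_when: false
```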
16142 1727204148.70847: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 16142 1727204148.71018: in run() - task 0affcd87-79f5-fddd-f6c7-000000000128 16142 1727204148.71042: variable 'ansible_search_path' from source: unknown 16142 1727204148.71050: variable 'ansible_search_path' from source: unknown 16142 1727204148.71095: calling self._execute() 16142 1727204148.71206: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204148.71218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204148.71241: variable 'omit' from source: magic vars 16142 1727204148.71662: variable 'ansible_distribution_major_version' from source: facts 16142 1727204148.71685: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204148.71814: variable 'network_provider' from source: set_fact 16142 1727204148.71826: Evaluated conditional (network_provider == "nm"): True 16142 1727204148.71929: variable '__network_wpa_supplicant_required' from source: role '' defaults 16142 1727204148.72028: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 16142 1727204148.72221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204148.75093: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204148.75187: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204148.75237: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204148.75283: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204148.75314: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204148.75591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204148.75626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204148.75657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204148.75712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204148.75733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204148.75792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204148.75820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 16142 1727204148.75849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204148.75899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204148.75917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204148.75961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204148.75992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204148.76024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204148.76070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204148.76090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204148.76254: variable 'network_connections' from source: task vars 16142 1727204148.76275: variable 'port1_profile' from source: play vars 16142 1727204148.76354: variable 'port1_profile' from source: play vars 16142 1727204148.76371: variable 'port2_profile' from source: play vars 16142 1727204148.76442: variable 'port2_profile' from source: play vars 16142 1727204148.76519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204148.76710: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204148.76755: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204148.76819: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204148.76852: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204148.76903: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204148.76927: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204148.76955: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204148.76988: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204148.77049: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204148.77330: variable 'network_connections' from source: task vars 16142 1727204148.77340: variable 'port1_profile' from source: play vars 16142 1727204148.77410: variable 'port1_profile' from source: play vars 16142 1727204148.77424: variable 'port2_profile' from source: play vars 16142 1727204148.77487: variable 'port2_profile' from source: play vars 16142 1727204148.77522: Evaluated conditional (__network_wpa_supplicant_required): False 16142 1727204148.77533: when evaluation is False, skipping this task 16142 1727204148.77550: _execute() done 16142 1727204148.77557: dumping result to json 16142 1727204148.77565: done dumping result, returning 16142 1727204148.77577: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-fddd-f6c7-000000000128] 16142 1727204148.77586: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000128 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 16142 1727204148.77736: no more pending results, returning what we have 16142 1727204148.77741: results queue empty 16142 1727204148.77742: checking for any_errors_fatal 16142 1727204148.77765: done checking for any_errors_fatal 16142 1727204148.77766: checking for max_fail_percentage 16142 1727204148.77768: done checking for max_fail_percentage 16142 1727204148.77769: checking to see if all hosts have failed and the running result is not ok 16142 1727204148.77770: done checking to see if all hosts have failed 16142 1727204148.77771: getting the remaining hosts for this loop 16142 1727204148.77772: done getting the remaining hosts for this loop 16142 1727204148.77776: getting the next task for host managed-node2 16142 1727204148.77783: done getting next task for host managed-node2 16142 1727204148.77787: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 16142 1727204148.77790: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204148.77811: getting variables 16142 1727204148.77813: in VariableManager get_vars() 16142 1727204148.77871: Calling all_inventory to load vars for managed-node2 16142 1727204148.77874: Calling groups_inventory to load vars for managed-node2 16142 1727204148.77877: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204148.77887: Calling all_plugins_play to load vars for managed-node2 16142 1727204148.77890: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204148.77893: Calling groups_plugins_play to load vars for managed-node2 16142 1727204148.78884: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000128 16142 1727204148.78888: WORKER PROCESS EXITING 16142 1727204148.79752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204148.81524: done with get_vars() 16142 1727204148.81570: done getting variables 16142 1727204148.81643: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:55:48 -0400 (0:00:00.123) 0:00:47.993 ***** 16142 1727204148.81678: entering _queue_task() for managed-node2/service 16142 1727204148.82032: worker is 1 (out of 1 available) 16142 1727204148.82044: exiting _queue_task() for managed-node2/service 16142 1727204148.82056: done queuing things up, now waiting for results queue to drain 16142 1727204148.82057: waiting for pending results... 
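In the trace above, the wpa_supplicant step is skipped because __network_wpa_supplicant_required evaluates to False even though network_provider is "nm". A sketch of a service task guarded the same way; only the task name and the conditionals come from the log, the module body (service name, states) is an assumption:

```yaml
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant                            # assumed service name; not shown in the log
    state: started
    enabled: true
  when:
    - network_provider == "nm"                      # evaluated True in this run
    - __network_wpa_supplicant_required | bool      # evaluated False, so the task is skipped
```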
16142 1727204148.82388: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 16142 1727204148.82539: in run() - task 0affcd87-79f5-fddd-f6c7-000000000129 16142 1727204148.82558: variable 'ansible_search_path' from source: unknown 16142 1727204148.82568: variable 'ansible_search_path' from source: unknown 16142 1727204148.82620: calling self._execute() 16142 1727204148.83533: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204148.83610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204148.83628: variable 'omit' from source: magic vars 16142 1727204148.84523: variable 'ansible_distribution_major_version' from source: facts 16142 1727204148.84590: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204148.84934: variable 'network_provider' from source: set_fact 16142 1727204148.84948: Evaluated conditional (network_provider == "initscripts"): False 16142 1727204148.84982: when evaluation is False, skipping this task 16142 1727204148.84991: _execute() done 16142 1727204148.84999: dumping result to json 16142 1727204148.85012: done dumping result, returning 16142 1727204148.85082: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-fddd-f6c7-000000000129] 16142 1727204148.85094: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000129 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204148.85285: no more pending results, returning what we have 16142 1727204148.85289: results queue empty 16142 1727204148.85291: checking for any_errors_fatal 16142 1727204148.85299: done checking for any_errors_fatal 16142 1727204148.85300: checking for max_fail_percentage 16142 1727204148.85309: done checking for max_fail_percentage 16142 1727204148.85310: checking to see if all hosts have failed and the running result is not ok 16142 1727204148.85311: done checking to see if all hosts have failed 16142 1727204148.85312: getting the remaining hosts for this loop 16142 1727204148.85316: done getting the remaining hosts for this loop 16142 1727204148.85321: getting the next task for host managed-node2 16142 1727204148.85336: done getting next task for host managed-node2 16142 1727204148.85342: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 16142 1727204148.85345: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204148.85371: getting variables 16142 1727204148.85374: in VariableManager get_vars() 16142 1727204148.85435: Calling all_inventory to load vars for managed-node2 16142 1727204148.85438: Calling groups_inventory to load vars for managed-node2 16142 1727204148.85440: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204148.85452: Calling all_plugins_play to load vars for managed-node2 16142 1727204148.85455: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204148.85458: Calling groups_plugins_play to load vars for managed-node2 16142 1727204148.86643: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000129 16142 1727204148.86647: WORKER PROCESS EXITING 16142 1727204148.88172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204148.90309: done with get_vars() 16142 1727204148.90343: done getting variables 16142 1727204148.90407: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:55:48 -0400 (0:00:00.087) 0:00:48.081 ***** 16142 1727204148.90445: entering _queue_task() for managed-node2/copy 16142 1727204148.90800: worker is 1 (out of 1 available) 16142 1727204148.90812: exiting _queue_task() for managed-node2/copy 16142 1727204148.90827: done queuing things up, now waiting for results queue to drain 16142 1727204148.90828: waiting for pending results... 
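The "Enable network service" step above is skipped for the mirror-image reason: its guard requires the initscripts provider, and this run uses NetworkManager. A sketch of that guard; the condition matches the logged false_condition, while the service name is an assumption:

```yaml
- name: Enable network service
  ansible.builtin.service:
    name: network                              # assumed legacy initscripts service name
    state: started
    enabled: true
  when: network_provider == "initscripts"      # evaluated False in this run, so skipped
```

The same guard explains the next skip in the log, where the initscripts network file dependency (a copy task) is only ensured when that provider is in use.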
16142 1727204148.91153: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 16142 1727204148.91290: in run() - task 0affcd87-79f5-fddd-f6c7-00000000012a 16142 1727204148.91309: variable 'ansible_search_path' from source: unknown 16142 1727204148.91317: variable 'ansible_search_path' from source: unknown 16142 1727204148.91382: calling self._execute() 16142 1727204148.91490: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204148.91503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204148.91518: variable 'omit' from source: magic vars 16142 1727204148.91936: variable 'ansible_distribution_major_version' from source: facts 16142 1727204148.91956: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204148.92112: variable 'network_provider' from source: set_fact 16142 1727204148.92125: Evaluated conditional (network_provider == "initscripts"): False 16142 1727204148.92132: when evaluation is False, skipping this task 16142 1727204148.92141: _execute() done 16142 1727204148.92153: dumping result to json 16142 1727204148.92161: done dumping result, returning 16142 1727204148.92177: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-fddd-f6c7-00000000012a] 16142 1727204148.92188: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000012a skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 16142 1727204148.92351: no more pending results, returning what we have 16142 1727204148.92356: results queue empty 16142 1727204148.92358: checking for any_errors_fatal 16142 1727204148.92362: done checking for any_errors_fatal 16142 1727204148.92365: checking for max_fail_percentage 16142 1727204148.92368: done checking for max_fail_percentage 16142 1727204148.92369: checking to see if all hosts have failed and the running result is not ok 16142 1727204148.92370: done checking to see if all hosts have failed 16142 1727204148.92370: getting the remaining hosts for this loop 16142 1727204148.92372: done getting the remaining hosts for this loop 16142 1727204148.92376: getting the next task for host managed-node2 16142 1727204148.92383: done getting next task for host managed-node2 16142 1727204148.92388: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 16142 1727204148.92391: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204148.92416: getting variables 16142 1727204148.92419: in VariableManager get_vars() 16142 1727204148.92479: Calling all_inventory to load vars for managed-node2 16142 1727204148.92482: Calling groups_inventory to load vars for managed-node2 16142 1727204148.92485: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204148.92497: Calling all_plugins_play to load vars for managed-node2 16142 1727204148.92501: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204148.92504: Calling groups_plugins_play to load vars for managed-node2 16142 1727204148.94187: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000012a 16142 1727204148.94191: WORKER PROCESS EXITING 16142 1727204148.94708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204148.96503: done with get_vars() 16142 1727204148.96743: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:55:48 -0400 (0:00:00.063) 0:00:48.145 ***** 16142 1727204148.96845: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 16142 1727204148.97174: worker is 1 (out of 1 available) 16142 1727204148.97187: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 16142 1727204148.97200: done queuing things up, now waiting for results queue to drain 16142 1727204148.97201: waiting for pending results... 16142 1727204148.97608: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 16142 1727204148.97767: in run() - task 0affcd87-79f5-fddd-f6c7-00000000012b 16142 1727204148.97790: variable 'ansible_search_path' from source: unknown 16142 1727204148.97798: variable 'ansible_search_path' from source: unknown 16142 1727204148.97842: calling self._execute() 16142 1727204148.97949: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204148.97961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204148.97982: variable 'omit' from source: magic vars 16142 1727204148.98402: variable 'ansible_distribution_major_version' from source: facts 16142 1727204148.98424: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204148.98439: variable 'omit' from source: magic vars 16142 1727204148.98501: variable 'omit' from source: magic vars 16142 1727204148.98749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204149.01445: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204149.01526: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204149.01575: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204149.01619: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204149.01651: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204149.01755: variable 'network_provider' from source: set_fact 16142 1727204149.01897: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204149.01951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204149.01987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204149.02038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204149.02057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204149.02145: variable 'omit' from source: magic vars 16142 1727204149.02359: variable 'omit' from source: magic vars 16142 1727204149.02477: variable 'network_connections' from source: task vars 16142 1727204149.02495: variable 'port1_profile' from source: play vars 16142 1727204149.02563: variable 'port1_profile' from source: play vars 16142 1727204149.02581: variable 'port2_profile' from source: play vars 16142 1727204149.02644: variable 'port2_profile' from source: play vars 16142 1727204149.02813: variable 'omit' from source: magic vars 16142 1727204149.02909: variable '__lsr_ansible_managed' from source: task vars 16142 1727204149.02973: variable '__lsr_ansible_managed' from source: task vars 16142 1727204149.03208: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 16142 1727204149.03456: Loaded config def from plugin (lookup/template) 16142 1727204149.03469: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 16142 1727204149.03502: File lookup term: get_ansible_managed.j2 16142 1727204149.03511: variable 'ansible_search_path' from source: unknown 16142 1727204149.03520: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 16142 1727204149.03538: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 16142 1727204149.03566: variable 
'ansible_search_path' from source: unknown 16142 1727204149.11333: variable 'ansible_managed' from source: unknown 16142 1727204149.11501: variable 'omit' from source: magic vars 16142 1727204149.11535: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204149.11570: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204149.11596: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204149.11625: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204149.11639: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204149.11744: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204149.11752: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204149.11759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204149.11984: Set connection var ansible_timeout to 10 16142 1727204149.11992: Set connection var ansible_connection to ssh 16142 1727204149.12001: Set connection var ansible_shell_type to sh 16142 1727204149.12009: Set connection var ansible_shell_executable to /bin/sh 16142 1727204149.12019: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204149.12029: Set connection var ansible_pipelining to False 16142 1727204149.12060: variable 'ansible_shell_executable' from source: unknown 16142 1727204149.12071: variable 'ansible_connection' from source: unknown 16142 1727204149.12077: variable 'ansible_module_compression' from source: unknown 16142 1727204149.12083: variable 'ansible_shell_type' from source: unknown 16142 1727204149.12088: variable 'ansible_shell_executable' from source: unknown 16142 1727204149.12093: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204149.12100: variable 'ansible_pipelining' from source: unknown 16142 1727204149.12105: variable 'ansible_timeout' from source: unknown 16142 1727204149.12111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204149.12246: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204149.12278: variable 'omit' from source: magic vars 16142 1727204149.12289: starting attempt loop 16142 1727204149.12295: running the handler 16142 1727204149.12311: _low_level_execute_command(): starting 16142 1727204149.12322: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204149.13180: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204149.13201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204149.13218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204149.13238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204149.13284: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204149.13301: stderr chunk (state=3): 
>>>debug2: match not found <<< 16142 1727204149.13315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204149.13333: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204149.13346: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204149.13358: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204149.13373: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204149.13387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204149.13402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204149.13418: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204149.13430: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204149.13443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204149.13521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204149.13546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204149.13567: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204149.13647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204149.15297: stdout chunk (state=3): >>>/root <<< 16142 1727204149.15498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204149.15502: stdout chunk (state=3): >>><<< 16142 1727204149.15504: stderr chunk (state=3): >>><<< 16142 1727204149.15618: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204149.15622: _low_level_execute_command(): starting 16142 1727204149.15626: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204149.1552398-19716-112606952596627 `" && echo ansible-tmp-1727204149.1552398-19716-112606952596627="` echo /root/.ansible/tmp/ansible-tmp-1727204149.1552398-19716-112606952596627 `" ) && sleep 0' 16142 1727204149.16434: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204149.16450: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204149.16469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204149.16491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204149.16535: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204149.16549: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204149.16566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204149.16586: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204149.16602: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204149.16615: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204149.16627: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204149.16641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204149.16657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204149.16673: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204149.16685: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204149.16700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204149.16779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204149.16802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204149.16825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204149.16898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204149.18784: stdout chunk (state=3): >>>ansible-tmp-1727204149.1552398-19716-112606952596627=/root/.ansible/tmp/ansible-tmp-1727204149.1552398-19716-112606952596627 <<< 16142 1727204149.18991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204149.18995: stdout chunk (state=3): >>><<< 16142 1727204149.18997: stderr chunk (state=3): >>><<< 16142 1727204149.19371: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204149.1552398-19716-112606952596627=/root/.ansible/tmp/ansible-tmp-1727204149.1552398-19716-112606952596627 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204149.19378: variable 'ansible_module_compression' from source: unknown 16142 1727204149.19382: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 16142 1727204149.19385: variable 'ansible_facts' from source: unknown 16142 1727204149.19388: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204149.1552398-19716-112606952596627/AnsiballZ_network_connections.py 16142 1727204149.19446: Sending initial data 16142 1727204149.19450: Sent initial data (168 bytes) 16142 1727204149.20846: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204149.20852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204149.20901: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204149.20905: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204149.20921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204149.20926: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204149.21020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204149.21041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204149.21191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204149.22861: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204149.22899: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204149.22942: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpsx50nc0a /root/.ansible/tmp/ansible-tmp-1727204149.1552398-19716-112606952596627/AnsiballZ_network_connections.py 
<<< 16142 1727204149.22978: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204149.25066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204149.25198: stderr chunk (state=3): >>><<< 16142 1727204149.25201: stdout chunk (state=3): >>><<< 16142 1727204149.25204: done transferring module to remote 16142 1727204149.25206: _low_level_execute_command(): starting 16142 1727204149.25208: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204149.1552398-19716-112606952596627/ /root/.ansible/tmp/ansible-tmp-1727204149.1552398-19716-112606952596627/AnsiballZ_network_connections.py && sleep 0' 16142 1727204149.26777: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204149.26781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204149.26873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204149.26877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204149.26880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204149.27058: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204149.27078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204149.27090: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204149.27267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204149.28997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204149.29080: stderr chunk (state=3): >>><<< 16142 1727204149.29084: stdout chunk (state=3): >>><<< 16142 1727204149.29186: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204149.29190: _low_level_execute_command(): starting 16142 1727204149.29192: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204149.1552398-19716-112606952596627/AnsiballZ_network_connections.py && sleep 0' 16142 1727204149.30706: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204149.30711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204149.30870: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204149.30874: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204149.30883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204149.31032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204149.31090: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204149.31256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204149.74051: stdout chunk (state=3): >>>Traceback (most recent call last): <<< 16142 1727204149.74085: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fae6ho3m/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 16142 1727204149.74102: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fae6ho3m/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/72f186f1-b611-4f2e-9d00-de0c3bf7aa23: error=unknown <<< 16142 1727204149.76322: stdout chunk (state=3): >>>Traceback (most recent call last):<<< 16142 1727204149.76443: stdout chunk (state=3): >>> <<< 16142 1727204149.76447: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fae6ho3m/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 16142 1727204149.76473: stdout chunk (state=3): >>> File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fae6ho3m/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail <<< 16142 1727204149.76506: stdout chunk (state=3): >>>ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/9eebdc14-bdb2-41b3-94db-1a5b2e988b68: error=unknown <<< 16142 1727204149.76710: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 16142 1727204149.78383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 16142 1727204149.78485: stderr chunk (state=3): >>><<< 16142 1727204149.78488: stdout chunk (state=3): >>><<< 16142 1727204149.78511: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fae6ho3m/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fae6ho3m/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/72f186f1-b611-4f2e-9d00-de0c3bf7aa23: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fae6ho3m/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fae6ho3m/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/9eebdc14-bdb2-41b3-94db-1a5b2e988b68: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, 
"force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 16142 1727204149.78557: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204149.1552398-19716-112606952596627/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204149.78567: _low_level_execute_command(): starting 16142 1727204149.78574: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204149.1552398-19716-112606952596627/ > /dev/null 2>&1 && sleep 0' 16142 1727204149.79285: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204149.79293: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204149.79313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204149.79327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204149.79369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204149.79378: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204149.79388: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204149.79401: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204149.79418: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204149.79424: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204149.79432: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204149.79445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204149.79456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204149.79465: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204149.79473: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204149.79482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204149.79834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204149.79850: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204149.79861: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204149.79929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204149.81818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204149.81822: stdout chunk (state=3): >>><<< 16142 1727204149.81828: stderr chunk (state=3): >>><<< 16142 1727204149.81858: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204149.81865: handler run complete 16142 1727204149.81899: attempt loop complete, returning result 16142 1727204149.81903: _execute() done 16142 1727204149.81905: dumping result to json 16142 1727204149.81910: done dumping result, returning 16142 1727204149.81921: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-fddd-f6c7-00000000012b] 16142 1727204149.81926: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000012b 16142 1727204149.82048: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000012b 
16142 1727204149.82051: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0.1", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 16142 1727204149.82196: no more pending results, returning what we have 16142 1727204149.82199: results queue empty 16142 1727204149.82201: checking for any_errors_fatal 16142 1727204149.82208: done checking for any_errors_fatal 16142 1727204149.82209: checking for max_fail_percentage 16142 1727204149.82211: done checking for max_fail_percentage 16142 1727204149.82211: checking to see if all hosts have failed and the running result is not ok 16142 1727204149.82212: done checking to see if all hosts have failed 16142 1727204149.82213: getting the remaining hosts for this loop 16142 1727204149.82214: done getting the remaining hosts for this loop 16142 1727204149.82218: getting the next task for host managed-node2 16142 1727204149.82225: done getting next task for host managed-node2 16142 1727204149.82229: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 16142 1727204149.82232: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204149.82247: getting variables 16142 1727204149.82249: in VariableManager get_vars() 16142 1727204149.82306: Calling all_inventory to load vars for managed-node2 16142 1727204149.82309: Calling groups_inventory to load vars for managed-node2 16142 1727204149.82312: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204149.82322: Calling all_plugins_play to load vars for managed-node2 16142 1727204149.82325: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204149.82328: Calling groups_plugins_play to load vars for managed-node2 16142 1727204149.87668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204149.90418: done with get_vars() 16142 1727204149.90446: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:55:49 -0400 (0:00:00.939) 0:00:49.084 ***** 16142 1727204149.90802: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 16142 1727204149.91790: worker is 1 (out of 1 available) 16142 1727204149.91802: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 16142 1727204149.91814: done queuing things up, now waiting for results queue to drain 16142 1727204149.91815: waiting for pending results... 
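The module_args printed above correspond to a role invocation roughly along the following lines; this is a minimal sketch (not the test's actual playbook) assuming the role's standard network_connections variable, tearing down the bond0.0 and bond0.1 profiles:

    - hosts: managed-node2
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            # Same connection spec as the module_args shown above:
            # remove the persistent profiles and take the links down.
            network_connections:
              - name: bond0.0
                persistent_state: absent
                state: down
              - name: bond0.1
                persistent_state: absent
                state: down

Note that the NM tracebacks ("Connection volatilize aborted ... error=unknown") were emitted on the module's stdout, but the module still returned changed: true, so the task result above is "changed" rather than "failed". The next task queued here, "Configure networking state", only runs when network_state is non-empty; this play leaves it at the role default of {}, so it is skipped below.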
16142 1727204149.92494: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 16142 1727204149.92653: in run() - task 0affcd87-79f5-fddd-f6c7-00000000012c 16142 1727204149.92673: variable 'ansible_search_path' from source: unknown 16142 1727204149.92677: variable 'ansible_search_path' from source: unknown 16142 1727204149.92713: calling self._execute() 16142 1727204149.92866: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204149.92876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204149.92888: variable 'omit' from source: magic vars 16142 1727204149.93504: variable 'ansible_distribution_major_version' from source: facts 16142 1727204149.93599: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204149.93784: variable 'network_state' from source: role '' defaults 16142 1727204149.93794: Evaluated conditional (network_state != {}): False 16142 1727204149.93800: when evaluation is False, skipping this task 16142 1727204149.93882: _execute() done 16142 1727204149.93885: dumping result to json 16142 1727204149.93890: done dumping result, returning 16142 1727204149.93898: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-fddd-f6c7-00000000012c] 16142 1727204149.93905: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000012c 16142 1727204149.94002: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000012c 16142 1727204149.94005: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204149.94065: no more pending results, returning what we have 16142 1727204149.94070: results queue empty 16142 1727204149.94073: checking for any_errors_fatal 16142 1727204149.94085: done checking for any_errors_fatal 16142 1727204149.94086: checking for max_fail_percentage 16142 1727204149.94089: done checking for max_fail_percentage 16142 1727204149.94090: checking to see if all hosts have failed and the running result is not ok 16142 1727204149.94091: done checking to see if all hosts have failed 16142 1727204149.94092: getting the remaining hosts for this loop 16142 1727204149.94093: done getting the remaining hosts for this loop 16142 1727204149.94097: getting the next task for host managed-node2 16142 1727204149.94104: done getting next task for host managed-node2 16142 1727204149.94109: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 16142 1727204149.94112: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204149.94138: getting variables 16142 1727204149.94140: in VariableManager get_vars() 16142 1727204149.94197: Calling all_inventory to load vars for managed-node2 16142 1727204149.94200: Calling groups_inventory to load vars for managed-node2 16142 1727204149.94203: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204149.94215: Calling all_plugins_play to load vars for managed-node2 16142 1727204149.94218: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204149.94221: Calling groups_plugins_play to load vars for managed-node2 16142 1727204149.96218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204149.98054: done with get_vars() 16142 1727204149.98095: done getting variables 16142 1727204149.98187: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:55:49 -0400 (0:00:00.074) 0:00:49.159 ***** 16142 1727204149.98235: entering _queue_task() for managed-node2/debug 16142 1727204149.98669: worker is 1 (out of 1 available) 16142 1727204149.98681: exiting _queue_task() for managed-node2/debug 16142 1727204149.98699: done queuing things up, now waiting for results queue to drain 16142 1727204149.98702: waiting for pending results... 16142 1727204149.99055: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 16142 1727204149.99221: in run() - task 0affcd87-79f5-fddd-f6c7-00000000012d 16142 1727204149.99245: variable 'ansible_search_path' from source: unknown 16142 1727204149.99254: variable 'ansible_search_path' from source: unknown 16142 1727204149.99300: calling self._execute() 16142 1727204149.99413: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204149.99424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204149.99441: variable 'omit' from source: magic vars 16142 1727204149.99846: variable 'ansible_distribution_major_version' from source: facts 16142 1727204149.99890: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204149.99905: variable 'omit' from source: magic vars 16142 1727204149.99981: variable 'omit' from source: magic vars 16142 1727204150.00032: variable 'omit' from source: magic vars 16142 1727204150.00090: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204150.00150: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204150.00187: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204150.00211: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204150.00226: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204150.00278: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204150.00291: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204150.00303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204150.00444: Set connection var ansible_timeout to 10 16142 1727204150.00456: Set connection var ansible_connection to ssh 16142 1727204150.00469: Set connection var ansible_shell_type to sh 16142 1727204150.00481: Set connection var ansible_shell_executable to /bin/sh 16142 1727204150.00493: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204150.00506: Set connection var ansible_pipelining to False 16142 1727204150.00538: variable 'ansible_shell_executable' from source: unknown 16142 1727204150.00546: variable 'ansible_connection' from source: unknown 16142 1727204150.00565: variable 'ansible_module_compression' from source: unknown 16142 1727204150.00576: variable 'ansible_shell_type' from source: unknown 16142 1727204150.00587: variable 'ansible_shell_executable' from source: unknown 16142 1727204150.00598: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204150.00609: variable 'ansible_pipelining' from source: unknown 16142 1727204150.00617: variable 'ansible_timeout' from source: unknown 16142 1727204150.00625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204150.00779: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204150.00804: variable 'omit' from source: magic vars 16142 1727204150.00816: starting attempt loop 16142 1727204150.00824: running the handler 16142 1727204150.00977: variable '__network_connections_result' from source: set_fact 16142 1727204150.01041: handler run complete 16142 1727204150.01070: attempt loop complete, returning result 16142 1727204150.01083: _execute() done 16142 1727204150.01090: dumping result to json 16142 1727204150.01097: done dumping result, returning 16142 1727204150.01112: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-fddd-f6c7-00000000012d] 16142 1727204150.01121: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000012d ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 16142 1727204150.01339: no more pending results, returning what we have 16142 1727204150.01343: results queue empty 16142 1727204150.01345: checking for any_errors_fatal 16142 1727204150.01354: done checking for any_errors_fatal 16142 1727204150.01355: checking for max_fail_percentage 16142 1727204150.01358: done checking for max_fail_percentage 16142 1727204150.01359: checking to see if all hosts have failed and the running result is not ok 16142 1727204150.01359: done checking to see if all hosts have failed 16142 1727204150.01360: getting the remaining hosts for this loop 16142 1727204150.01361: done getting the remaining hosts for this loop 16142 1727204150.01366: getting the next task for host managed-node2 16142 1727204150.01374: done getting next task for host managed-node2 16142 1727204150.01379: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 16142 1727204150.01382: 
^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204150.01394: getting variables 16142 1727204150.01397: in VariableManager get_vars() 16142 1727204150.01461: Calling all_inventory to load vars for managed-node2 16142 1727204150.01467: Calling groups_inventory to load vars for managed-node2 16142 1727204150.01470: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204150.01482: Calling all_plugins_play to load vars for managed-node2 16142 1727204150.01485: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204150.01489: Calling groups_plugins_play to load vars for managed-node2 16142 1727204150.02843: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000012d 16142 1727204150.02847: WORKER PROCESS EXITING 16142 1727204150.04342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204150.07778: done with get_vars() 16142 1727204150.07850: done getting variables 16142 1727204150.07922: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:55:50 -0400 (0:00:00.097) 0:00:49.256 ***** 16142 1727204150.08091: entering _queue_task() for managed-node2/debug 16142 1727204150.08683: worker is 1 (out of 1 available) 16142 1727204150.08793: exiting _queue_task() for managed-node2/debug 16142 1727204150.08805: done queuing things up, now waiting for results queue to drain 16142 1727204150.08806: waiting for pending results... 
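The two reporting tasks at this point (tasks/main.yml:177 and :181) just print the captured __network_connections_result fact. A plausible sketch of that pattern, assuming plain debug-with-var tasks (the role's actual tasks may differ in detail):

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result

In this run __network_connections_result.stderr is just "\n", so stderr_lines renders as a single empty string, as seen in the task output.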
16142 1727204150.09170: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 16142 1727204150.09311: in run() - task 0affcd87-79f5-fddd-f6c7-00000000012e 16142 1727204150.09330: variable 'ansible_search_path' from source: unknown 16142 1727204150.09338: variable 'ansible_search_path' from source: unknown 16142 1727204150.09385: calling self._execute() 16142 1727204150.09506: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204150.09509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204150.09521: variable 'omit' from source: magic vars 16142 1727204150.10170: variable 'ansible_distribution_major_version' from source: facts 16142 1727204150.10173: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204150.10176: variable 'omit' from source: magic vars 16142 1727204150.10179: variable 'omit' from source: magic vars 16142 1727204150.10181: variable 'omit' from source: magic vars 16142 1727204150.10210: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204150.10245: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204150.10274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204150.10291: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204150.10302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204150.10342: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204150.10345: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204150.10348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204150.10468: Set connection var ansible_timeout to 10 16142 1727204150.10472: Set connection var ansible_connection to ssh 16142 1727204150.10483: Set connection var ansible_shell_type to sh 16142 1727204150.10489: Set connection var ansible_shell_executable to /bin/sh 16142 1727204150.10494: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204150.10502: Set connection var ansible_pipelining to False 16142 1727204150.10528: variable 'ansible_shell_executable' from source: unknown 16142 1727204150.10531: variable 'ansible_connection' from source: unknown 16142 1727204150.10540: variable 'ansible_module_compression' from source: unknown 16142 1727204150.10543: variable 'ansible_shell_type' from source: unknown 16142 1727204150.10545: variable 'ansible_shell_executable' from source: unknown 16142 1727204150.10547: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204150.10550: variable 'ansible_pipelining' from source: unknown 16142 1727204150.10552: variable 'ansible_timeout' from source: unknown 16142 1727204150.10556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204150.10718: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 
1727204150.10729: variable 'omit' from source: magic vars 16142 1727204150.10734: starting attempt loop 16142 1727204150.10740: running the handler 16142 1727204150.10793: variable '__network_connections_result' from source: set_fact 16142 1727204150.10884: variable '__network_connections_result' from source: set_fact 16142 1727204150.11017: handler run complete 16142 1727204150.11047: attempt loop complete, returning result 16142 1727204150.11050: _execute() done 16142 1727204150.11053: dumping result to json 16142 1727204150.11056: done dumping result, returning 16142 1727204150.11071: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-fddd-f6c7-00000000012e] 16142 1727204150.11080: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000012e 16142 1727204150.11183: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000012e 16142 1727204150.11187: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0.1", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 16142 1727204150.11290: no more pending results, returning what we have 16142 1727204150.11295: results queue empty 16142 1727204150.11296: checking for any_errors_fatal 16142 1727204150.11303: done checking for any_errors_fatal 16142 1727204150.11304: checking for max_fail_percentage 16142 1727204150.11307: done checking for max_fail_percentage 16142 1727204150.11308: checking to see if all hosts have failed and the running result is not ok 16142 1727204150.11309: done checking to see if all hosts have failed 16142 1727204150.11309: getting the remaining hosts for this loop 16142 1727204150.11311: done getting the remaining hosts for this loop 16142 1727204150.11315: getting the next task for host managed-node2 16142 1727204150.11321: done getting next task for host managed-node2 16142 1727204150.11326: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 16142 1727204150.11330: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204150.11342: getting variables 16142 1727204150.11345: in VariableManager get_vars() 16142 1727204150.11402: Calling all_inventory to load vars for managed-node2 16142 1727204150.11405: Calling groups_inventory to load vars for managed-node2 16142 1727204150.11408: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204150.11419: Calling all_plugins_play to load vars for managed-node2 16142 1727204150.11422: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204150.11425: Calling groups_plugins_play to load vars for managed-node2 16142 1727204150.14168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204150.15927: done with get_vars() 16142 1727204150.15967: done getting variables 16142 1727204150.16043: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:55:50 -0400 (0:00:00.083) 0:00:49.339 ***** 16142 1727204150.16296: entering _queue_task() for managed-node2/debug 16142 1727204150.16642: worker is 1 (out of 1 available) 16142 1727204150.16654: exiting _queue_task() for managed-node2/debug 16142 1727204150.16668: done queuing things up, now waiting for results queue to drain 16142 1727204150.16670: waiting for pending results... 16142 1727204150.17006: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 16142 1727204150.17155: in run() - task 0affcd87-79f5-fddd-f6c7-00000000012f 16142 1727204150.17181: variable 'ansible_search_path' from source: unknown 16142 1727204150.17188: variable 'ansible_search_path' from source: unknown 16142 1727204150.17229: calling self._execute() 16142 1727204150.17334: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204150.17352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204150.17370: variable 'omit' from source: magic vars 16142 1727204150.17750: variable 'ansible_distribution_major_version' from source: facts 16142 1727204150.17771: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204150.17896: variable 'network_state' from source: role '' defaults 16142 1727204150.17913: Evaluated conditional (network_state != {}): False 16142 1727204150.17921: when evaluation is False, skipping this task 16142 1727204150.17927: _execute() done 16142 1727204150.17934: dumping result to json 16142 1727204150.17946: done dumping result, returning 16142 1727204150.17958: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-fddd-f6c7-00000000012f] 16142 1727204150.17971: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000012f 16142 1727204150.18093: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000012f 16142 1727204150.18103: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 16142 1727204150.18160: no more pending results, returning what we 
have 16142 1727204150.18166: results queue empty 16142 1727204150.18167: checking for any_errors_fatal 16142 1727204150.18175: done checking for any_errors_fatal 16142 1727204150.18176: checking for max_fail_percentage 16142 1727204150.18179: done checking for max_fail_percentage 16142 1727204150.18180: checking to see if all hosts have failed and the running result is not ok 16142 1727204150.18181: done checking to see if all hosts have failed 16142 1727204150.18182: getting the remaining hosts for this loop 16142 1727204150.18183: done getting the remaining hosts for this loop 16142 1727204150.18187: getting the next task for host managed-node2 16142 1727204150.18195: done getting next task for host managed-node2 16142 1727204150.18200: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 16142 1727204150.18204: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204150.18230: getting variables 16142 1727204150.18232: in VariableManager get_vars() 16142 1727204150.18306: Calling all_inventory to load vars for managed-node2 16142 1727204150.18310: Calling groups_inventory to load vars for managed-node2 16142 1727204150.18312: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204150.18327: Calling all_plugins_play to load vars for managed-node2 16142 1727204150.18331: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204150.18335: Calling groups_plugins_play to load vars for managed-node2 16142 1727204150.20566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204150.23087: done with get_vars() 16142 1727204150.23122: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:55:50 -0400 (0:00:00.070) 0:00:49.410 ***** 16142 1727204150.23375: entering _queue_task() for managed-node2/ping 16142 1727204150.23733: worker is 1 (out of 1 available) 16142 1727204150.23748: exiting _queue_task() for managed-node2/ping 16142 1727204150.23760: done queuing things up, now waiting for results queue to drain 16142 1727204150.23761: waiting for pending results... 
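The "Re-test connectivity" task queued here (tasks/main.yml:192) uses the ping action module; a minimal equivalent sketch (the role's own task may carry additional conditions) is simply:

    - name: Re-test connectivity
      ansible.builtin.ping:

The ping module is pushed to the target as AnsiballZ_ping.py and executed with the remote interpreter (/usr/bin/python3.9 in this run); the {"ping": "pong"} reply below confirms the managed node is still reachable after the bond profiles were removed.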
16142 1727204150.24069: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 16142 1727204150.24259: in run() - task 0affcd87-79f5-fddd-f6c7-000000000130 16142 1727204150.24282: variable 'ansible_search_path' from source: unknown 16142 1727204150.24294: variable 'ansible_search_path' from source: unknown 16142 1727204150.24345: calling self._execute() 16142 1727204150.24450: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204150.24460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204150.24477: variable 'omit' from source: magic vars 16142 1727204150.24990: variable 'ansible_distribution_major_version' from source: facts 16142 1727204150.25176: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204150.25188: variable 'omit' from source: magic vars 16142 1727204150.25284: variable 'omit' from source: magic vars 16142 1727204150.25372: variable 'omit' from source: magic vars 16142 1727204150.25487: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204150.25588: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204150.25617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204150.25693: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204150.25730: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204150.25803: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204150.25888: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204150.25896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204150.26125: Set connection var ansible_timeout to 10 16142 1727204150.26133: Set connection var ansible_connection to ssh 16142 1727204150.26147: Set connection var ansible_shell_type to sh 16142 1727204150.26155: Set connection var ansible_shell_executable to /bin/sh 16142 1727204150.26165: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204150.26177: Set connection var ansible_pipelining to False 16142 1727204150.26318: variable 'ansible_shell_executable' from source: unknown 16142 1727204150.26327: variable 'ansible_connection' from source: unknown 16142 1727204150.26334: variable 'ansible_module_compression' from source: unknown 16142 1727204150.26343: variable 'ansible_shell_type' from source: unknown 16142 1727204150.26350: variable 'ansible_shell_executable' from source: unknown 16142 1727204150.26356: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204150.26363: variable 'ansible_pipelining' from source: unknown 16142 1727204150.26372: variable 'ansible_timeout' from source: unknown 16142 1727204150.26379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204150.26698: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204150.26737: variable 'omit' from source: magic vars 16142 
1727204150.26751: starting attempt loop 16142 1727204150.26759: running the handler 16142 1727204150.26779: _low_level_execute_command(): starting 16142 1727204150.26792: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204150.28491: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204150.28627: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204150.28648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204150.28670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204150.28722: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204150.28738: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204150.28754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204150.28777: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204150.28790: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204150.28801: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204150.28815: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204150.28833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204150.28854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204150.28869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204150.28883: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204150.28898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204150.28983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204150.29000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204150.29014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204150.29175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204150.30783: stdout chunk (state=3): >>>/root <<< 16142 1727204150.30984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204150.30987: stdout chunk (state=3): >>><<< 16142 1727204150.30989: stderr chunk (state=3): >>><<< 16142 1727204150.31122: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204150.31126: _low_level_execute_command(): starting 16142 1727204150.31129: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204150.310152-19802-83530001303063 `" && echo ansible-tmp-1727204150.310152-19802-83530001303063="` echo /root/.ansible/tmp/ansible-tmp-1727204150.310152-19802-83530001303063 `" ) && sleep 0' 16142 1727204150.32052: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204150.32056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204150.32089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204150.32093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204150.32096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204150.32098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204150.32158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204150.32175: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204150.32256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204150.34499: stdout chunk (state=3): >>>ansible-tmp-1727204150.310152-19802-83530001303063=/root/.ansible/tmp/ansible-tmp-1727204150.310152-19802-83530001303063 <<< 16142 1727204150.34519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204150.34628: stderr chunk (state=3): >>><<< 16142 1727204150.34891: stdout chunk (state=3): >>><<< 16142 1727204150.35479: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204150.310152-19802-83530001303063=/root/.ansible/tmp/ansible-tmp-1727204150.310152-19802-83530001303063 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204150.35483: variable 'ansible_module_compression' from source: unknown 16142 1727204150.35486: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 16142 1727204150.35488: variable 'ansible_facts' from source: unknown 16142 1727204150.35522: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204150.310152-19802-83530001303063/AnsiballZ_ping.py 16142 1727204150.36218: Sending initial data 16142 1727204150.36234: Sent initial data (151 bytes) 16142 1727204150.37929: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204150.37959: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204150.37997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204150.38039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204150.38087: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204150.38105: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204150.38124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204150.38142: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204150.38153: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204150.38163: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204150.38176: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204150.38196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204150.38234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204150.38259: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204150.38281: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204150.38306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204150.38395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204150.38455: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204150.38477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204150.38548: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204150.40275: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204150.40293: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204150.40346: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmp7uf08lgf /root/.ansible/tmp/ansible-tmp-1727204150.310152-19802-83530001303063/AnsiballZ_ping.py <<< 16142 1727204150.40384: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204150.41660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204150.41842: stderr chunk (state=3): >>><<< 16142 1727204150.41846: stdout chunk (state=3): >>><<< 16142 1727204150.41848: done transferring module to remote 16142 1727204150.41851: _low_level_execute_command(): starting 16142 1727204150.41853: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204150.310152-19802-83530001303063/ /root/.ansible/tmp/ansible-tmp-1727204150.310152-19802-83530001303063/AnsiballZ_ping.py && sleep 0' 16142 1727204150.43099: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204150.43123: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204150.43147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204150.43188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204150.43230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204150.43242: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204150.43257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204150.43284: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204150.43311: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204150.43329: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204150.43350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204150.43378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204150.43410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204150.43422: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204150.43432: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204150.43447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204150.43554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204150.43584: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204150.43615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 
1727204150.43688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204150.45509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204150.45513: stdout chunk (state=3): >>><<< 16142 1727204150.45515: stderr chunk (state=3): >>><<< 16142 1727204150.45640: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204150.45644: _low_level_execute_command(): starting 16142 1727204150.45646: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204150.310152-19802-83530001303063/AnsiballZ_ping.py && sleep 0' 16142 1727204150.46545: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204150.46856: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204150.46869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204150.46883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204150.46923: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204150.46930: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204150.46944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204150.46958: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204150.46967: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204150.46978: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204150.46986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204150.46995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204150.47006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204150.47013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204150.47020: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204150.47030: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204150.47107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204150.47122: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204150.47125: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204150.47464: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204150.60363: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 16142 1727204150.61398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 16142 1727204150.61451: stderr chunk (state=3): >>><<< 16142 1727204150.61454: stdout chunk (state=3): >>><<< 16142 1727204150.61571: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
16142 1727204150.61576: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204150.310152-19802-83530001303063/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204150.61585: _low_level_execute_command(): starting 16142 1727204150.61588: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204150.310152-19802-83530001303063/ > /dev/null 2>&1 && sleep 0' 16142 1727204150.62152: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204150.62170: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204150.62186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204150.62204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204150.62242: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204150.62257: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204150.62275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204150.62293: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204150.62305: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204150.62315: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204150.62325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204150.62337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204150.62353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204150.62385: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204150.62393: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204150.62404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204150.62478: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204150.62500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204150.62546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204150.64777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204150.64783: stderr chunk (state=3): >>><<< 16142 1727204150.64785: stdout chunk (state=3): >>><<< 16142 1727204150.64788: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204150.64790: handler run complete 16142 1727204150.64792: attempt loop complete, returning result 16142 1727204150.64793: _execute() done 16142 1727204150.64795: dumping result to json 16142 1727204150.64796: done dumping result, returning 16142 1727204150.64798: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-fddd-f6c7-000000000130] 16142 1727204150.64800: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000130 16142 1727204150.64880: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000130 16142 1727204150.64884: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 16142 1727204150.64951: no more pending results, returning what we have 16142 1727204150.64954: results queue empty 16142 1727204150.64955: checking for any_errors_fatal 16142 1727204150.64961: done checking for any_errors_fatal 16142 1727204150.64962: checking for max_fail_percentage 16142 1727204150.64965: done checking for max_fail_percentage 16142 1727204150.64966: checking to see if all hosts have failed and the running result is not ok 16142 1727204150.64967: done checking to see if all hosts have failed 16142 1727204150.64967: getting the remaining hosts for this loop 16142 1727204150.64968: done getting the remaining hosts for this loop 16142 1727204150.64971: getting the next task for host managed-node2 16142 1727204150.64986: done getting next task for host managed-node2 16142 1727204150.64989: ^ task is: TASK: meta (role_complete) 16142 1727204150.64991: ^ state is: HOST STATE: block=2, task=30, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204150.65003: getting variables 16142 1727204150.65004: in VariableManager get_vars() 16142 1727204150.65058: Calling all_inventory to load vars for managed-node2 16142 1727204150.65061: Calling groups_inventory to load vars for managed-node2 16142 1727204150.65066: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204150.65078: Calling all_plugins_play to load vars for managed-node2 16142 1727204150.65081: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204150.65083: Calling groups_plugins_play to load vars for managed-node2 16142 1727204150.66582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204150.67533: done with get_vars() 16142 1727204150.67559: done getting variables 16142 1727204150.67624: done queuing things up, now waiting for results queue to drain 16142 1727204150.67626: results queue empty 16142 1727204150.67627: checking for any_errors_fatal 16142 1727204150.67630: done checking for any_errors_fatal 16142 1727204150.67630: checking for max_fail_percentage 16142 1727204150.67631: done checking for max_fail_percentage 16142 1727204150.67631: checking to see if all hosts have failed and the running result is not ok 16142 1727204150.67632: done checking to see if all hosts have failed 16142 1727204150.67632: getting the remaining hosts for this loop 16142 1727204150.67633: done getting the remaining hosts for this loop 16142 1727204150.67635: getting the next task for host managed-node2 16142 1727204150.67639: done getting next task for host managed-node2 16142 1727204150.67640: ^ task is: TASK: From the active connection, get the controller profile "{{ controller_profile }}" 16142 1727204150.67641: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204150.67643: getting variables 16142 1727204150.67648: in VariableManager get_vars() 16142 1727204150.67666: Calling all_inventory to load vars for managed-node2 16142 1727204150.67667: Calling groups_inventory to load vars for managed-node2 16142 1727204150.67669: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204150.67673: Calling all_plugins_play to load vars for managed-node2 16142 1727204150.67674: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204150.67676: Calling groups_plugins_play to load vars for managed-node2 16142 1727204150.72794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204150.74197: done with get_vars() 16142 1727204150.74233: done getting variables 16142 1727204150.74278: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204150.74367: variable 'controller_profile' from source: play vars TASK [From the active connection, get the controller profile "bond0"] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:200 Tuesday 24 September 2024 14:55:50 -0400 (0:00:00.510) 0:00:49.920 ***** 16142 1727204150.74398: entering _queue_task() for managed-node2/command 16142 1727204150.74799: worker is 1 (out of 1 available) 16142 1727204150.74815: exiting _queue_task() for managed-node2/command 16142 1727204150.74832: done queuing things up, now waiting for results queue to drain 16142 1727204150.74834: waiting for pending results... 
16142 1727204150.75068: running TaskExecutor() for managed-node2/TASK: From the active connection, get the controller profile "bond0" 16142 1727204150.75139: in run() - task 0affcd87-79f5-fddd-f6c7-000000000160 16142 1727204150.75148: variable 'ansible_search_path' from source: unknown 16142 1727204150.75182: calling self._execute() 16142 1727204150.75267: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204150.75271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204150.75278: variable 'omit' from source: magic vars 16142 1727204150.75650: variable 'ansible_distribution_major_version' from source: facts 16142 1727204150.75673: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204150.75805: variable 'network_provider' from source: set_fact 16142 1727204150.75817: Evaluated conditional (network_provider == "nm"): True 16142 1727204150.75821: variable 'omit' from source: magic vars 16142 1727204150.75849: variable 'omit' from source: magic vars 16142 1727204150.75967: variable 'controller_profile' from source: play vars 16142 1727204150.75989: variable 'omit' from source: magic vars 16142 1727204150.76030: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204150.76057: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204150.76093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204150.76101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204150.76123: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204150.76150: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204150.76177: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204150.76211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204150.76315: Set connection var ansible_timeout to 10 16142 1727204150.76318: Set connection var ansible_connection to ssh 16142 1727204150.76321: Set connection var ansible_shell_type to sh 16142 1727204150.76323: Set connection var ansible_shell_executable to /bin/sh 16142 1727204150.76325: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204150.76329: Set connection var ansible_pipelining to False 16142 1727204150.76355: variable 'ansible_shell_executable' from source: unknown 16142 1727204150.76366: variable 'ansible_connection' from source: unknown 16142 1727204150.76369: variable 'ansible_module_compression' from source: unknown 16142 1727204150.76371: variable 'ansible_shell_type' from source: unknown 16142 1727204150.76374: variable 'ansible_shell_executable' from source: unknown 16142 1727204150.76376: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204150.76378: variable 'ansible_pipelining' from source: unknown 16142 1727204150.76381: variable 'ansible_timeout' from source: unknown 16142 1727204150.76408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204150.76581: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204150.76585: variable 'omit' from source: magic vars 16142 1727204150.76587: starting attempt loop 16142 1727204150.76590: running the handler 16142 1727204150.76603: _low_level_execute_command(): starting 16142 1727204150.76610: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204150.77445: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204150.77479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204150.77519: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204150.77594: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204150.77607: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204150.77616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204150.77686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204150.79277: stdout chunk (state=3): >>>/root <<< 16142 1727204150.79390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204150.79490: stderr chunk (state=3): >>><<< 16142 1727204150.79500: stdout chunk (state=3): >>><<< 16142 1727204150.79527: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204150.79544: _low_level_execute_command(): starting 16142 1727204150.79551: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204150.7952788-19835-172689209762105 `" && echo ansible-tmp-1727204150.7952788-19835-172689209762105="` echo /root/.ansible/tmp/ansible-tmp-1727204150.7952788-19835-172689209762105 `" ) && sleep 0' 16142 1727204150.80174: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204150.80179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204150.80187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204150.80218: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204150.80230: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204150.80233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204150.80249: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204150.80256: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204150.80262: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204150.80272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204150.80281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204150.80289: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204150.80294: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204150.80300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204150.80391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204150.80398: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204150.80405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204150.80463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204150.82299: stdout chunk (state=3): >>>ansible-tmp-1727204150.7952788-19835-172689209762105=/root/.ansible/tmp/ansible-tmp-1727204150.7952788-19835-172689209762105 <<< 16142 1727204150.82419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204150.82474: stderr chunk (state=3): >>><<< 16142 1727204150.82478: stdout chunk (state=3): >>><<< 16142 1727204150.82495: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204150.7952788-19835-172689209762105=/root/.ansible/tmp/ansible-tmp-1727204150.7952788-19835-172689209762105 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204150.82526: variable 'ansible_module_compression' from source: unknown 16142 1727204150.82574: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 16142 1727204150.82607: variable 'ansible_facts' from source: unknown 16142 1727204150.82660: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204150.7952788-19835-172689209762105/AnsiballZ_command.py 16142 1727204150.82774: Sending initial data 16142 1727204150.82779: Sent initial data (156 bytes) 16142 1727204150.83519: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204150.83524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204150.83572: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204150.83575: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204150.83578: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204150.83580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204150.83626: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204150.83635: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204150.83693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204150.85459: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 16142 1727204150.85470: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 16142 1727204150.85475: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 16142 1727204150.85477: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 16142 1727204150.85480: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 16142 1727204150.85555: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 16142 1727204150.85605: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" 
revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204150.85640: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204150.85685: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpca75xzw0 /root/.ansible/tmp/ansible-tmp-1727204150.7952788-19835-172689209762105/AnsiballZ_command.py <<< 16142 1727204150.85768: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204150.87073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204150.87273: stderr chunk (state=3): >>><<< 16142 1727204150.87276: stdout chunk (state=3): >>><<< 16142 1727204150.87279: done transferring module to remote 16142 1727204150.87281: _low_level_execute_command(): starting 16142 1727204150.87283: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204150.7952788-19835-172689209762105/ /root/.ansible/tmp/ansible-tmp-1727204150.7952788-19835-172689209762105/AnsiballZ_command.py && sleep 0' 16142 1727204150.87916: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204150.87938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204150.87971: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204150.87977: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204150.87985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204150.88003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204150.88007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204150.88070: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204150.88074: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204150.88118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204150.89816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204150.89879: stderr chunk (state=3): >>><<< 16142 1727204150.89882: stdout chunk (state=3): >>><<< 16142 1727204150.89894: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204150.89897: _low_level_execute_command(): starting 16142 1727204150.89903: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204150.7952788-19835-172689209762105/AnsiballZ_command.py && sleep 0' 16142 1727204150.90438: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204150.90442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204150.90477: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204150.90504: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204150.90507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204150.90567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204150.90586: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204150.90590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204150.90674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204151.06061: stdout chunk (state=3): >>> {"changed": true, "stdout": "connection.id: bond0\nconnection.uuid: da90ddbf-a91a-40cb-8cf8-f4fc8a58a465\nconnection.stable-id: --\nconnection.type: bond\nconnection.interface-name: nm-bond\nconnection.autoconnect: yes\nconnection.autoconnect-priority: 0\nconnection.autoconnect-retries: -1 (default)\nconnection.multi-connect: 0 (default)\nconnection.auth-retries: -1\nconnection.timestamp: 1727204142\nconnection.permissions: --\nconnection.zone: --\nconnection.controller: --\nconnection.master: --\nconnection.slave-type: --\nconnection.port-type: --\nconnection.autoconnect-slaves: -1 (default)\nconnection.autoconnect-ports: -1 (default)\nconnection.down-on-poweroff: -1 (default)\nconnection.secondaries: --\nconnection.gateway-ping-timeout: 0\nconnection.metered: unknown\nconnection.lldp: default\nconnection.mdns: -1 (default)\nconnection.llmnr: -1 
(default)\nconnection.dns-over-tls: -1 (default)\nconnection.mptcp-flags: 0x0 (default)\nconnection.wait-device-timeout: -1\nconnection.wait-activation-delay: -1\nipv4.method: auto\nipv4.dns: --\nipv4.dns-search: --\nipv4.dns-options: --\nipv4.dns-priority: 0\nipv4.addresses: --\nipv4.gateway: --\nipv4.routes: --\nipv4.route-metric: 65535\nipv4.route-table: 0 (unspec)\nipv4.routing-rules: --\nipv4.replace-local-rule: -1 (default)\nipv4.dhcp-send-release: -1 (default)\nipv4.ignore-auto-routes: no\nipv4.ignore-auto-dns: no\nipv4.dhcp-client-id: --\nipv4.dhcp-iaid: --\nipv4.dhcp-dscp: --\nipv4.dhcp-timeout: 0 (default)\nipv4.dhcp-send-hostname: yes\nipv4.dhcp-hostname: --\nipv4.dhcp-fqdn: --\nipv4.dhcp-hostname-flags: 0x0 (none)\nipv4.never-default: no\nipv4.may-fail: yes\nipv4.required-timeout: -1 (default)\nipv4.dad-timeout: -1 (default)\nipv4.dhcp-vendor-class-identifier: --\nipv4.link-local: 0 (default)\nipv4.dhcp-reject-servers: --\nipv4.auto-route-ext-gw: -1 (default)\nipv6.method: auto\nipv6.dns: --\nipv6.dns-search: --\nipv6.dns-options: --\nipv6.dns-priority: 0\nipv6.addresses: --\nipv6.gateway: --\nipv6.routes: --\nipv6.route-metric: -1\nipv6.route-table: 0 (unspec)\nipv6.routing-rules: --\nipv6.replace-local-rule: -1 (default)\nipv6.dhcp-send-release: -1 (default)\nipv6.ignore-auto-routes: no\nipv6.ignore-auto-dns: no\nipv6.never-default: no\nipv6.may-fail: yes\nipv6.required-timeout: -1 (default)\nipv6.ip6-privacy: -1 (default)\nipv6.temp-valid-lifetime: 0 (default)\nipv6.temp-preferred-lifetime: 0 (default)\nipv6.addr-gen-mode: default\nipv6.ra-timeout: 0 (default)\nipv6.mtu: auto\nipv6.dhcp-pd-hint: --\nipv6.dhcp-duid: --\nipv6.dhcp-iaid: --\nipv6.dhcp-timeout: 0 (default)\nipv6.dhcp-send-hostname: yes\nipv6.dhcp-hostname: --\nipv6.dhcp-hostname-flags: 0x0 (none)\nipv6.auto-route-ext-gw: -1 (default)\nipv6.token: --\nbond.options: mode=active-backup,miimon=110\nproxy.method: none\nproxy.browser-only: no\nproxy.pac-url: --\nproxy.pac-script: --\nGENERAL.NAME: bond0\nGENERAL.UUID: da90ddbf-a91a-40cb-8cf8-f4fc8a58a465\nGENERAL.DEVICES: nm-bond\nGENERAL.IP-IFACE: nm-bond\nGENERAL.STATE: activated\nGENERAL.DEFAULT: no\nGENERAL.DEFAULT6: yes\nGENERAL.SPEC-OBJECT: --\nGENERAL.VPN: no\nGENERAL.DBUS-PATH: /org/freedesktop/NetworkManager/ActiveConnection/23\nGENERAL.CON-PATH: /org/freedesktop/NetworkManager/Settings/18\nGENERAL.ZONE: --\nGENERAL.MASTER-PATH: --\nIP4.ADDRESS[1]: 192.0.2.113/24\nIP4.GATEWAY: 192.0.2.1\nIP4.ROUTE[1]: dst = 192.0.2.0/24, nh = 0.0.0.0, mt = 65535\nIP4.ROUTE[2]: dst = 0.0.0.0/0, nh = 192.0.2.1, mt = 65535\nIP4.DNS[1]: 192.0.2.1\nDHCP4.OPTION[1]: broadcast_address = 192.0.2.255\nDHCP4.OPTION[2]: dhcp_client_identifier = 01:8e:40:12:6d:28:cd\nDHCP4.OPTION[3]: dhcp_lease_time = 240\nDHCP4.OPTION[4]: dhcp_server_identifier = 192.0.2.1\nDHCP4.OPTION[5]: <<< 16142 1727204151.06076: stdout chunk (state=3): >>> domain_name_servers = 192.0.2.1\nDHCP4.OPTION[6]: expiry = 1727204382\nDHCP4.OPTION[7]: host_name = managed-node2\nDHCP4.OPTION[8]: ip_address = 192.0.2.113\nDHCP4.OPTION[9]: next_server = 192.0.2.1\nDHCP4.OPTION[10]: requested_broadcast_address = 1\nDHCP4.OPTION[11]: requested_domain_name = 1\nDHCP4.OPTION[12]: requested_domain_name_servers = 1\nDHCP4.OPTION[13]: requested_domain_search = 1\nDHCP4.OPTION[14]: requested_host_name = 1\nDHCP4.OPTION[15]: requested_interface_mtu = 1\nDHCP4.OPTION[16]: requested_ms_classless_static_routes = 1\nDHCP4.OPTION[17]: requested_nis_domain = 1\nDHCP4.OPTION[18]: requested_nis_servers = 1\nDHCP4.OPTION[19]: 
requested_ntp_servers = 1\nDHCP4.OPTION[20]: requested_rfc3442_classless_static_routes = 1\nDHCP4.OPTION[21]: requested_root_path = 1\nDHCP4.OPTION[22]: requested_routers = 1\nDHCP4.OPTION[23]: requested_static_routes = 1\nDHCP4.OPTION[24]: requested_subnet_mask = 1\nDHCP4.OPTION[25]: requested_time_offset = 1\nDHCP4.OPTION[26]: requested_wpad = 1\nDHCP4.OPTION[27]: routers = 192.0.2.1\nDHCP4.OPTION[28]: subnet_mask = 255.255.255.0\nIP6.ADDRESS[1]: 2001:db8::1df/128\nIP6.ADDRESS[2]: 2001:db8::587b:8112:92d2:14be/64\nIP6.ADDRESS[3]: fe80::9d05:3ad9:b812:ee53/64\nIP6.GATEWAY: fe80::bc1b:82ff:fef4:eeec\nIP6.ROUTE[1]: dst = fe80::/64, nh = ::, mt = 1024\nIP6.ROUTE[2]: dst = 2001:db8::/64, nh = ::, mt = 300\nIP6.ROUTE[3]: dst = ::/0, nh = fe80::bc1b:82ff:fef4:eeec, mt = 300\nIP6.ROUTE[4]: dst = 2001:db8::1df/128, nh = ::, mt = 300\nIP6.DNS[1]: 2001:db8::4057:7fff:fe5e:3b7e\nIP6.DNS[2]: fe80::bc1b:82ff:fef4:eeec\nDHCP6.OPTION[1]: dhcp6_client_id = 00:04:5c:04:34:81:38:a6:fd:7a:af:ae:0b:77:c9:49:63:73\nDHCP6.OPTION[2]: dhcp6_name_servers = 2001:db8::4057:7fff:fe5e:3b7e\nDHCP6.OPTION[3]: fqdn_fqdn = managed-node2\nDHCP6.OPTION[4]: iaid = 8c:3b:13:c0\nDHCP6.OPTION[5]: ip6_address = 2001:db8::1df", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0"], "start": "2024-09-24 14:55:51.038271", "end": "2024-09-24 14:55:51.059359", "delta": "0:00:00.021088", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 16142 1727204151.07286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204151.07329: stderr chunk (state=3): >>><<< 16142 1727204151.07333: stdout chunk (state=3): >>><<< 16142 1727204151.07541: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "connection.id: bond0\nconnection.uuid: da90ddbf-a91a-40cb-8cf8-f4fc8a58a465\nconnection.stable-id: --\nconnection.type: bond\nconnection.interface-name: nm-bond\nconnection.autoconnect: yes\nconnection.autoconnect-priority: 0\nconnection.autoconnect-retries: -1 (default)\nconnection.multi-connect: 0 (default)\nconnection.auth-retries: -1\nconnection.timestamp: 1727204142\nconnection.permissions: --\nconnection.zone: --\nconnection.controller: --\nconnection.master: --\nconnection.slave-type: --\nconnection.port-type: --\nconnection.autoconnect-slaves: -1 (default)\nconnection.autoconnect-ports: -1 (default)\nconnection.down-on-poweroff: -1 (default)\nconnection.secondaries: --\nconnection.gateway-ping-timeout: 0\nconnection.metered: unknown\nconnection.lldp: default\nconnection.mdns: -1 (default)\nconnection.llmnr: -1 (default)\nconnection.dns-over-tls: -1 (default)\nconnection.mptcp-flags: 0x0 (default)\nconnection.wait-device-timeout: -1\nconnection.wait-activation-delay: -1\nipv4.method: auto\nipv4.dns: --\nipv4.dns-search: --\nipv4.dns-options: --\nipv4.dns-priority: 0\nipv4.addresses: --\nipv4.gateway: --\nipv4.routes: --\nipv4.route-metric: 65535\nipv4.route-table: 0 (unspec)\nipv4.routing-rules: --\nipv4.replace-local-rule: -1 (default)\nipv4.dhcp-send-release: -1 (default)\nipv4.ignore-auto-routes: no\nipv4.ignore-auto-dns: no\nipv4.dhcp-client-id: --\nipv4.dhcp-iaid: --\nipv4.dhcp-dscp: --\nipv4.dhcp-timeout: 0 (default)\nipv4.dhcp-send-hostname: yes\nipv4.dhcp-hostname: --\nipv4.dhcp-fqdn: --\nipv4.dhcp-hostname-flags: 0x0 (none)\nipv4.never-default: no\nipv4.may-fail: yes\nipv4.required-timeout: -1 (default)\nipv4.dad-timeout: -1 (default)\nipv4.dhcp-vendor-class-identifier: --\nipv4.link-local: 0 (default)\nipv4.dhcp-reject-servers: --\nipv4.auto-route-ext-gw: -1 (default)\nipv6.method: auto\nipv6.dns: --\nipv6.dns-search: --\nipv6.dns-options: --\nipv6.dns-priority: 0\nipv6.addresses: --\nipv6.gateway: --\nipv6.routes: --\nipv6.route-metric: -1\nipv6.route-table: 0 (unspec)\nipv6.routing-rules: --\nipv6.replace-local-rule: -1 (default)\nipv6.dhcp-send-release: -1 (default)\nipv6.ignore-auto-routes: no\nipv6.ignore-auto-dns: no\nipv6.never-default: no\nipv6.may-fail: yes\nipv6.required-timeout: -1 (default)\nipv6.ip6-privacy: -1 (default)\nipv6.temp-valid-lifetime: 0 (default)\nipv6.temp-preferred-lifetime: 0 (default)\nipv6.addr-gen-mode: default\nipv6.ra-timeout: 0 (default)\nipv6.mtu: auto\nipv6.dhcp-pd-hint: --\nipv6.dhcp-duid: --\nipv6.dhcp-iaid: --\nipv6.dhcp-timeout: 0 (default)\nipv6.dhcp-send-hostname: yes\nipv6.dhcp-hostname: --\nipv6.dhcp-hostname-flags: 0x0 (none)\nipv6.auto-route-ext-gw: -1 (default)\nipv6.token: --\nbond.options: mode=active-backup,miimon=110\nproxy.method: none\nproxy.browser-only: no\nproxy.pac-url: --\nproxy.pac-script: --\nGENERAL.NAME: bond0\nGENERAL.UUID: da90ddbf-a91a-40cb-8cf8-f4fc8a58a465\nGENERAL.DEVICES: nm-bond\nGENERAL.IP-IFACE: nm-bond\nGENERAL.STATE: activated\nGENERAL.DEFAULT: no\nGENERAL.DEFAULT6: yes\nGENERAL.SPEC-OBJECT: --\nGENERAL.VPN: no\nGENERAL.DBUS-PATH: /org/freedesktop/NetworkManager/ActiveConnection/23\nGENERAL.CON-PATH: /org/freedesktop/NetworkManager/Settings/18\nGENERAL.ZONE: --\nGENERAL.MASTER-PATH: --\nIP4.ADDRESS[1]: 192.0.2.113/24\nIP4.GATEWAY: 192.0.2.1\nIP4.ROUTE[1]: dst = 192.0.2.0/24, nh = 
0.0.0.0, mt = 65535\nIP4.ROUTE[2]: dst = 0.0.0.0/0, nh = 192.0.2.1, mt = 65535\nIP4.DNS[1]: 192.0.2.1\nDHCP4.OPTION[1]: broadcast_address = 192.0.2.255\nDHCP4.OPTION[2]: dhcp_client_identifier = 01:8e:40:12:6d:28:cd\nDHCP4.OPTION[3]: dhcp_lease_time = 240\nDHCP4.OPTION[4]: dhcp_server_identifier = 192.0.2.1\nDHCP4.OPTION[5]: domain_name_servers = 192.0.2.1\nDHCP4.OPTION[6]: expiry = 1727204382\nDHCP4.OPTION[7]: host_name = managed-node2\nDHCP4.OPTION[8]: ip_address = 192.0.2.113\nDHCP4.OPTION[9]: next_server = 192.0.2.1\nDHCP4.OPTION[10]: requested_broadcast_address = 1\nDHCP4.OPTION[11]: requested_domain_name = 1\nDHCP4.OPTION[12]: requested_domain_name_servers = 1\nDHCP4.OPTION[13]: requested_domain_search = 1\nDHCP4.OPTION[14]: requested_host_name = 1\nDHCP4.OPTION[15]: requested_interface_mtu = 1\nDHCP4.OPTION[16]: requested_ms_classless_static_routes = 1\nDHCP4.OPTION[17]: requested_nis_domain = 1\nDHCP4.OPTION[18]: requested_nis_servers = 1\nDHCP4.OPTION[19]: requested_ntp_servers = 1\nDHCP4.OPTION[20]: requested_rfc3442_classless_static_routes = 1\nDHCP4.OPTION[21]: requested_root_path = 1\nDHCP4.OPTION[22]: requested_routers = 1\nDHCP4.OPTION[23]: requested_static_routes = 1\nDHCP4.OPTION[24]: requested_subnet_mask = 1\nDHCP4.OPTION[25]: requested_time_offset = 1\nDHCP4.OPTION[26]: requested_wpad = 1\nDHCP4.OPTION[27]: routers = 192.0.2.1\nDHCP4.OPTION[28]: subnet_mask = 255.255.255.0\nIP6.ADDRESS[1]: 2001:db8::1df/128\nIP6.ADDRESS[2]: 2001:db8::587b:8112:92d2:14be/64\nIP6.ADDRESS[3]: fe80::9d05:3ad9:b812:ee53/64\nIP6.GATEWAY: fe80::bc1b:82ff:fef4:eeec\nIP6.ROUTE[1]: dst = fe80::/64, nh = ::, mt = 1024\nIP6.ROUTE[2]: dst = 2001:db8::/64, nh = ::, mt = 300\nIP6.ROUTE[3]: dst = ::/0, nh = fe80::bc1b:82ff:fef4:eeec, mt = 300\nIP6.ROUTE[4]: dst = 2001:db8::1df/128, nh = ::, mt = 300\nIP6.DNS[1]: 2001:db8::4057:7fff:fe5e:3b7e\nIP6.DNS[2]: fe80::bc1b:82ff:fef4:eeec\nDHCP6.OPTION[1]: dhcp6_client_id = 00:04:5c:04:34:81:38:a6:fd:7a:af:ae:0b:77:c9:49:63:73\nDHCP6.OPTION[2]: dhcp6_name_servers = 2001:db8::4057:7fff:fe5e:3b7e\nDHCP6.OPTION[3]: fqdn_fqdn = managed-node2\nDHCP6.OPTION[4]: iaid = 8c:3b:13:c0\nDHCP6.OPTION[5]: ip6_address = 2001:db8::1df", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0"], "start": "2024-09-24 14:55:51.038271", "end": "2024-09-24 14:55:51.059359", "delta": "0:00:00.021088", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 16142 1727204151.07552: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli c show --active bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204150.7952788-19835-172689209762105/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204151.07556: _low_level_execute_command(): starting 16142 1727204151.07558: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204150.7952788-19835-172689209762105/ > /dev/null 2>&1 && sleep 0' 16142 1727204151.09084: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204151.09292: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204151.09329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204151.09399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204151.09478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204151.09589: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204151.09610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204151.09631: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204151.09646: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204151.09656: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204151.09669: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204151.09681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204151.09695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204151.09705: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204151.09719: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204151.09733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204151.10021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204151.10040: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204151.10055: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204151.10245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 
2 <<< 16142 1727204151.12505: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204151.12511: stdout chunk (state=3): >>><<< 16142 1727204151.12514: stderr chunk (state=3): >>><<< 16142 1727204151.12571: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204151.12574: handler run complete 16142 1727204151.12771: Evaluated conditional (False): False 16142 1727204151.12775: attempt loop complete, returning result 16142 1727204151.12777: _execute() done 16142 1727204151.12779: dumping result to json 16142 1727204151.12781: done dumping result, returning 16142 1727204151.12783: done running TaskExecutor() for managed-node2/TASK: From the active connection, get the controller profile "bond0" [0affcd87-79f5-fddd-f6c7-000000000160] 16142 1727204151.12785: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000160 ok: [managed-node2] => { "changed": false, "cmd": [ "nmcli", "c", "show", "--active", "bond0" ], "delta": "0:00:00.021088", "end": "2024-09-24 14:55:51.059359", "rc": 0, "start": "2024-09-24 14:55:51.038271" } STDOUT: connection.id: bond0 connection.uuid: da90ddbf-a91a-40cb-8cf8-f4fc8a58a465 connection.stable-id: -- connection.type: bond connection.interface-name: nm-bond connection.autoconnect: yes connection.autoconnect-priority: 0 connection.autoconnect-retries: -1 (default) connection.multi-connect: 0 (default) connection.auth-retries: -1 connection.timestamp: 1727204142 connection.permissions: -- connection.zone: -- connection.controller: -- connection.master: -- connection.slave-type: -- connection.port-type: -- connection.autoconnect-slaves: -1 (default) connection.autoconnect-ports: -1 (default) connection.down-on-poweroff: -1 (default) connection.secondaries: -- connection.gateway-ping-timeout: 0 connection.metered: unknown connection.lldp: default connection.mdns: -1 (default) connection.llmnr: -1 (default) connection.dns-over-tls: -1 (default) connection.mptcp-flags: 0x0 (default) connection.wait-device-timeout: -1 connection.wait-activation-delay: -1 ipv4.method: auto ipv4.dns: -- ipv4.dns-search: -- ipv4.dns-options: -- ipv4.dns-priority: 0 ipv4.addresses: -- ipv4.gateway: -- ipv4.routes: -- ipv4.route-metric: 65535 ipv4.route-table: 0 (unspec) ipv4.routing-rules: -- ipv4.replace-local-rule: -1 (default) ipv4.dhcp-send-release: -1 (default) ipv4.ignore-auto-routes: no ipv4.ignore-auto-dns: no 
ipv4.dhcp-client-id: -- ipv4.dhcp-iaid: -- ipv4.dhcp-dscp: -- ipv4.dhcp-timeout: 0 (default) ipv4.dhcp-send-hostname: yes ipv4.dhcp-hostname: -- ipv4.dhcp-fqdn: -- ipv4.dhcp-hostname-flags: 0x0 (none) ipv4.never-default: no ipv4.may-fail: yes ipv4.required-timeout: -1 (default) ipv4.dad-timeout: -1 (default) ipv4.dhcp-vendor-class-identifier: -- ipv4.link-local: 0 (default) ipv4.dhcp-reject-servers: -- ipv4.auto-route-ext-gw: -1 (default) ipv6.method: auto ipv6.dns: -- ipv6.dns-search: -- ipv6.dns-options: -- ipv6.dns-priority: 0 ipv6.addresses: -- ipv6.gateway: -- ipv6.routes: -- ipv6.route-metric: -1 ipv6.route-table: 0 (unspec) ipv6.routing-rules: -- ipv6.replace-local-rule: -1 (default) ipv6.dhcp-send-release: -1 (default) ipv6.ignore-auto-routes: no ipv6.ignore-auto-dns: no ipv6.never-default: no ipv6.may-fail: yes ipv6.required-timeout: -1 (default) ipv6.ip6-privacy: -1 (default) ipv6.temp-valid-lifetime: 0 (default) ipv6.temp-preferred-lifetime: 0 (default) ipv6.addr-gen-mode: default ipv6.ra-timeout: 0 (default) ipv6.mtu: auto ipv6.dhcp-pd-hint: -- ipv6.dhcp-duid: -- ipv6.dhcp-iaid: -- ipv6.dhcp-timeout: 0 (default) ipv6.dhcp-send-hostname: yes ipv6.dhcp-hostname: -- ipv6.dhcp-hostname-flags: 0x0 (none) ipv6.auto-route-ext-gw: -1 (default) ipv6.token: -- bond.options: mode=active-backup,miimon=110 proxy.method: none proxy.browser-only: no proxy.pac-url: -- proxy.pac-script: -- GENERAL.NAME: bond0 GENERAL.UUID: da90ddbf-a91a-40cb-8cf8-f4fc8a58a465 GENERAL.DEVICES: nm-bond GENERAL.IP-IFACE: nm-bond GENERAL.STATE: activated GENERAL.DEFAULT: no GENERAL.DEFAULT6: yes GENERAL.SPEC-OBJECT: -- GENERAL.VPN: no GENERAL.DBUS-PATH: /org/freedesktop/NetworkManager/ActiveConnection/23 GENERAL.CON-PATH: /org/freedesktop/NetworkManager/Settings/18 GENERAL.ZONE: -- GENERAL.MASTER-PATH: -- IP4.ADDRESS[1]: 192.0.2.113/24 IP4.GATEWAY: 192.0.2.1 IP4.ROUTE[1]: dst = 192.0.2.0/24, nh = 0.0.0.0, mt = 65535 IP4.ROUTE[2]: dst = 0.0.0.0/0, nh = 192.0.2.1, mt = 65535 IP4.DNS[1]: 192.0.2.1 DHCP4.OPTION[1]: broadcast_address = 192.0.2.255 DHCP4.OPTION[2]: dhcp_client_identifier = 01:8e:40:12:6d:28:cd DHCP4.OPTION[3]: dhcp_lease_time = 240 DHCP4.OPTION[4]: dhcp_server_identifier = 192.0.2.1 DHCP4.OPTION[5]: domain_name_servers = 192.0.2.1 DHCP4.OPTION[6]: expiry = 1727204382 DHCP4.OPTION[7]: host_name = managed-node2 DHCP4.OPTION[8]: ip_address = 192.0.2.113 DHCP4.OPTION[9]: next_server = 192.0.2.1 DHCP4.OPTION[10]: requested_broadcast_address = 1 DHCP4.OPTION[11]: requested_domain_name = 1 DHCP4.OPTION[12]: requested_domain_name_servers = 1 DHCP4.OPTION[13]: requested_domain_search = 1 DHCP4.OPTION[14]: requested_host_name = 1 DHCP4.OPTION[15]: requested_interface_mtu = 1 DHCP4.OPTION[16]: requested_ms_classless_static_routes = 1 DHCP4.OPTION[17]: requested_nis_domain = 1 DHCP4.OPTION[18]: requested_nis_servers = 1 DHCP4.OPTION[19]: requested_ntp_servers = 1 DHCP4.OPTION[20]: requested_rfc3442_classless_static_routes = 1 DHCP4.OPTION[21]: requested_root_path = 1 DHCP4.OPTION[22]: requested_routers = 1 DHCP4.OPTION[23]: requested_static_routes = 1 DHCP4.OPTION[24]: requested_subnet_mask = 1 DHCP4.OPTION[25]: requested_time_offset = 1 DHCP4.OPTION[26]: requested_wpad = 1 DHCP4.OPTION[27]: routers = 192.0.2.1 DHCP4.OPTION[28]: subnet_mask = 255.255.255.0 IP6.ADDRESS[1]: 2001:db8::1df/128 IP6.ADDRESS[2]: 2001:db8::587b:8112:92d2:14be/64 IP6.ADDRESS[3]: fe80::9d05:3ad9:b812:ee53/64 IP6.GATEWAY: fe80::bc1b:82ff:fef4:eeec IP6.ROUTE[1]: dst = fe80::/64, nh = ::, mt = 1024 IP6.ROUTE[2]: dst = 2001:db8::/64, nh = ::, mt 
= 300 IP6.ROUTE[3]: dst = ::/0, nh = fe80::bc1b:82ff:fef4:eeec, mt = 300 IP6.ROUTE[4]: dst = 2001:db8::1df/128, nh = ::, mt = 300 IP6.DNS[1]: 2001:db8::4057:7fff:fe5e:3b7e IP6.DNS[2]: fe80::bc1b:82ff:fef4:eeec DHCP6.OPTION[1]: dhcp6_client_id = 00:04:5c:04:34:81:38:a6:fd:7a:af:ae:0b:77:c9:49:63:73 DHCP6.OPTION[2]: dhcp6_name_servers = 2001:db8::4057:7fff:fe5e:3b7e DHCP6.OPTION[3]: fqdn_fqdn = managed-node2 DHCP6.OPTION[4]: iaid = 8c:3b:13:c0 DHCP6.OPTION[5]: ip6_address = 2001:db8::1df 16142 1727204151.13015: no more pending results, returning what we have 16142 1727204151.13019: results queue empty 16142 1727204151.13019: checking for any_errors_fatal 16142 1727204151.13021: done checking for any_errors_fatal 16142 1727204151.13022: checking for max_fail_percentage 16142 1727204151.13023: done checking for max_fail_percentage 16142 1727204151.13024: checking to see if all hosts have failed and the running result is not ok 16142 1727204151.13025: done checking to see if all hosts have failed 16142 1727204151.13026: getting the remaining hosts for this loop 16142 1727204151.13027: done getting the remaining hosts for this loop 16142 1727204151.13030: getting the next task for host managed-node2 16142 1727204151.13037: done getting next task for host managed-node2 16142 1727204151.13039: ^ task is: TASK: Assert that the controller profile is activated 16142 1727204151.13041: ^ state is: HOST STATE: block=2, task=32, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204151.13045: getting variables 16142 1727204151.13047: in VariableManager get_vars() 16142 1727204151.13104: Calling all_inventory to load vars for managed-node2 16142 1727204151.13107: Calling groups_inventory to load vars for managed-node2 16142 1727204151.13109: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204151.13119: Calling all_plugins_play to load vars for managed-node2 16142 1727204151.13122: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204151.13124: Calling groups_plugins_play to load vars for managed-node2 16142 1727204151.13824: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000160 16142 1727204151.13827: WORKER PROCESS EXITING 16142 1727204151.15198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204151.21394: done with get_vars() 16142 1727204151.21547: done getting variables 16142 1727204151.21617: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the controller profile is activated] ************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:207 Tuesday 24 September 2024 14:55:51 -0400 (0:00:00.473) 0:00:50.394 ***** 16142 1727204151.21770: entering _queue_task() for managed-node2/assert 16142 1727204151.22566: worker is 1 (out of 1 available) 16142 1727204151.22581: exiting _queue_task() for managed-node2/assert 16142 1727204151.22594: done queuing things up, now waiting for results queue to drain 16142 
1727204151.22595: waiting for pending results... 16142 1727204151.23667: running TaskExecutor() for managed-node2/TASK: Assert that the controller profile is activated 16142 1727204151.24054: in run() - task 0affcd87-79f5-fddd-f6c7-000000000161 16142 1727204151.24165: variable 'ansible_search_path' from source: unknown 16142 1727204151.24212: calling self._execute() 16142 1727204151.24440: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204151.24444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204151.24455: variable 'omit' from source: magic vars 16142 1727204151.25497: variable 'ansible_distribution_major_version' from source: facts 16142 1727204151.25511: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204151.25907: variable 'network_provider' from source: set_fact 16142 1727204151.25913: Evaluated conditional (network_provider == "nm"): True 16142 1727204151.25984: variable 'omit' from source: magic vars 16142 1727204151.26015: variable 'omit' from source: magic vars 16142 1727204151.26323: variable 'controller_profile' from source: play vars 16142 1727204151.26345: variable 'omit' from source: magic vars 16142 1727204151.26396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204151.26446: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204151.26486: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204151.26507: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204151.26521: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204151.26561: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204151.26575: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204151.26587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204151.26713: Set connection var ansible_timeout to 10 16142 1727204151.26721: Set connection var ansible_connection to ssh 16142 1727204151.26730: Set connection var ansible_shell_type to sh 16142 1727204151.26744: Set connection var ansible_shell_executable to /bin/sh 16142 1727204151.26754: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204151.26767: Set connection var ansible_pipelining to False 16142 1727204151.26799: variable 'ansible_shell_executable' from source: unknown 16142 1727204151.26809: variable 'ansible_connection' from source: unknown 16142 1727204151.26817: variable 'ansible_module_compression' from source: unknown 16142 1727204151.26823: variable 'ansible_shell_type' from source: unknown 16142 1727204151.26829: variable 'ansible_shell_executable' from source: unknown 16142 1727204151.26838: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204151.26846: variable 'ansible_pipelining' from source: unknown 16142 1727204151.26853: variable 'ansible_timeout' from source: unknown 16142 1727204151.26861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204151.27016: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204151.27034: variable 'omit' from source: magic vars 16142 1727204151.27048: starting attempt loop 16142 1727204151.27056: running the handler 16142 1727204151.27244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204151.31852: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204151.32143: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204151.32234: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204151.32283: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204151.32315: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204151.32397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204151.32434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204151.32471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204151.32612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204151.32630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204151.33054: variable 'active_controller_profile' from source: set_fact 16142 1727204151.33094: Evaluated conditional (active_controller_profile.stdout | length != 0): True 16142 1727204151.33106: handler run complete 16142 1727204151.33127: attempt loop complete, returning result 16142 1727204151.33134: _execute() done 16142 1727204151.33143: dumping result to json 16142 1727204151.33150: done dumping result, returning 16142 1727204151.33161: done running TaskExecutor() for managed-node2/TASK: Assert that the controller profile is activated [0affcd87-79f5-fddd-f6c7-000000000161] 16142 1727204151.33180: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000161 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 16142 1727204151.33404: no more pending results, returning what we have 16142 1727204151.33407: results queue empty 16142 1727204151.33408: checking for any_errors_fatal 16142 1727204151.33421: done checking for any_errors_fatal 16142 1727204151.33422: checking for max_fail_percentage 16142 1727204151.33424: done checking for max_fail_percentage 16142 1727204151.33425: checking to see if all hosts have failed and the running result is not ok 16142 1727204151.33425: done checking to see if all hosts have failed 16142 1727204151.33426: 
getting the remaining hosts for this loop 16142 1727204151.33427: done getting the remaining hosts for this loop 16142 1727204151.33431: getting the next task for host managed-node2 16142 1727204151.33437: done getting next task for host managed-node2 16142 1727204151.33441: ^ task is: TASK: Get the controller device details 16142 1727204151.33443: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16142 1727204151.33446: getting variables 16142 1727204151.33448: in VariableManager get_vars() 16142 1727204151.33512: Calling all_inventory to load vars for managed-node2 16142 1727204151.33522: Calling groups_inventory to load vars for managed-node2 16142 1727204151.33525: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204151.33536: Calling all_plugins_play to load vars for managed-node2 16142 1727204151.33539: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204151.33542: Calling groups_plugins_play to load vars for managed-node2 16142 1727204151.34590: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000161 16142 1727204151.34597: WORKER PROCESS EXITING 16142 1727204151.36631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204151.40258: done with get_vars() 16142 1727204151.40296: done getting variables 16142 1727204151.40404: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the controller device details] *************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:214 Tuesday 24 September 2024 14:55:51 -0400 (0:00:00.187) 0:00:50.582 ***** 16142 1727204151.40556: entering _queue_task() for managed-node2/command 16142 1727204151.41267: worker is 1 (out of 1 available) 16142 1727204151.41281: exiting _queue_task() for managed-node2/command 16142 1727204151.41294: done queuing things up, now waiting for results queue to drain 16142 1727204151.41295: waiting for pending results... 
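The assertion that passed just above checks the nmcli output captured earlier for the bond controller profile. The playbook source is not reproduced in this log, so the following is only a sketch of what the task at tests_bond_removal.yml:207 likely resembles, inferred from the two conditions the log reports evaluating (network_provider == "nm" and active_controller_profile.stdout | length != 0):

    # Hypothetical reconstruction; only the two expressions are taken from the log,
    # the task layout and the failure message are assumptions.
    - name: Assert that the controller profile is activated
      assert:
        that:
          - active_controller_profile.stdout | length != 0
        fail_msg: "controller profile is not active"   # assumed message, not shown in the log
      when: network_provider == "nm"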
16142 1727204151.42070: running TaskExecutor() for managed-node2/TASK: Get the controller device details 16142 1727204151.42381: in run() - task 0affcd87-79f5-fddd-f6c7-000000000162 16142 1727204151.42489: variable 'ansible_search_path' from source: unknown 16142 1727204151.42531: calling self._execute() 16142 1727204151.42770: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204151.42880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204151.42895: variable 'omit' from source: magic vars 16142 1727204151.43786: variable 'ansible_distribution_major_version' from source: facts 16142 1727204151.43805: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204151.43931: variable 'network_provider' from source: set_fact 16142 1727204151.43947: Evaluated conditional (network_provider == "initscripts"): False 16142 1727204151.43955: when evaluation is False, skipping this task 16142 1727204151.43961: _execute() done 16142 1727204151.43972: dumping result to json 16142 1727204151.43979: done dumping result, returning 16142 1727204151.43989: done running TaskExecutor() for managed-node2/TASK: Get the controller device details [0affcd87-79f5-fddd-f6c7-000000000162] 16142 1727204151.43998: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000162 skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 16142 1727204151.44171: no more pending results, returning what we have 16142 1727204151.44176: results queue empty 16142 1727204151.44178: checking for any_errors_fatal 16142 1727204151.44187: done checking for any_errors_fatal 16142 1727204151.44188: checking for max_fail_percentage 16142 1727204151.44191: done checking for max_fail_percentage 16142 1727204151.44192: checking to see if all hosts have failed and the running result is not ok 16142 1727204151.44193: done checking to see if all hosts have failed 16142 1727204151.44194: getting the remaining hosts for this loop 16142 1727204151.44195: done getting the remaining hosts for this loop 16142 1727204151.44199: getting the next task for host managed-node2 16142 1727204151.44206: done getting next task for host managed-node2 16142 1727204151.44208: ^ task is: TASK: Assert that the controller profile is activated 16142 1727204151.44211: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204151.44215: getting variables 16142 1727204151.44216: in VariableManager get_vars() 16142 1727204151.44275: Calling all_inventory to load vars for managed-node2 16142 1727204151.44277: Calling groups_inventory to load vars for managed-node2 16142 1727204151.44279: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204151.44294: Calling all_plugins_play to load vars for managed-node2 16142 1727204151.44297: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204151.44302: Calling groups_plugins_play to load vars for managed-node2 16142 1727204151.44912: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000162 16142 1727204151.44916: WORKER PROCESS EXITING 16142 1727204151.47324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204151.50302: done with get_vars() 16142 1727204151.50359: done getting variables 16142 1727204151.50462: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the controller profile is activated] ************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:221 Tuesday 24 September 2024 14:55:51 -0400 (0:00:00.099) 0:00:50.681 ***** 16142 1727204151.50498: entering _queue_task() for managed-node2/assert 16142 1727204151.50902: worker is 1 (out of 1 available) 16142 1727204151.50916: exiting _queue_task() for managed-node2/assert 16142 1727204151.50927: done queuing things up, now waiting for results queue to drain 16142 1727204151.50928: waiting for pending results... 
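Both initscripts-only checks above are skipped because network_provider is "nm" for this run: Ansible evaluates the when: expression before executing the action, and when it is false it records the task as skipped and reports the failing expression as false_condition in the result. A minimal illustration of the pattern (the task name and debug body are placeholders, not taken from the playbook):

    # Illustrative only: a task gated on the legacy provider, skipped under NetworkManager.
    - name: Example task that only runs under the initscripts provider
      debug:
        msg: "runs only when the legacy network scripts provider is selected"
      when: network_provider == "initscripts"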
16142 1727204151.51255: running TaskExecutor() for managed-node2/TASK: Assert that the controller profile is activated 16142 1727204151.51362: in run() - task 0affcd87-79f5-fddd-f6c7-000000000163 16142 1727204151.51382: variable 'ansible_search_path' from source: unknown 16142 1727204151.51430: calling self._execute() 16142 1727204151.51565: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204151.51572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204151.51581: variable 'omit' from source: magic vars 16142 1727204151.52025: variable 'ansible_distribution_major_version' from source: facts 16142 1727204151.52041: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204151.52178: variable 'network_provider' from source: set_fact 16142 1727204151.52189: Evaluated conditional (network_provider == "initscripts"): False 16142 1727204151.52193: when evaluation is False, skipping this task 16142 1727204151.52195: _execute() done 16142 1727204151.52206: dumping result to json 16142 1727204151.52209: done dumping result, returning 16142 1727204151.52217: done running TaskExecutor() for managed-node2/TASK: Assert that the controller profile is activated [0affcd87-79f5-fddd-f6c7-000000000163] 16142 1727204151.52222: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000163 16142 1727204151.52333: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000163 16142 1727204151.52338: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 16142 1727204151.52397: no more pending results, returning what we have 16142 1727204151.52401: results queue empty 16142 1727204151.52402: checking for any_errors_fatal 16142 1727204151.52409: done checking for any_errors_fatal 16142 1727204151.52410: checking for max_fail_percentage 16142 1727204151.52415: done checking for max_fail_percentage 16142 1727204151.52416: checking to see if all hosts have failed and the running result is not ok 16142 1727204151.52417: done checking to see if all hosts have failed 16142 1727204151.52418: getting the remaining hosts for this loop 16142 1727204151.52419: done getting the remaining hosts for this loop 16142 1727204151.52424: getting the next task for host managed-node2 16142 1727204151.52437: done getting next task for host managed-node2 16142 1727204151.52444: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 16142 1727204151.52448: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 16142 1727204151.52477: getting variables 16142 1727204151.52480: in VariableManager get_vars() 16142 1727204151.52540: Calling all_inventory to load vars for managed-node2 16142 1727204151.52543: Calling groups_inventory to load vars for managed-node2 16142 1727204151.52545: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204151.52558: Calling all_plugins_play to load vars for managed-node2 16142 1727204151.52560: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204151.52565: Calling groups_plugins_play to load vars for managed-node2 16142 1727204151.54842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204151.57947: done with get_vars() 16142 1727204151.57982: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:55:51 -0400 (0:00:00.075) 0:00:50.757 ***** 16142 1727204151.58096: entering _queue_task() for managed-node2/include_tasks 16142 1727204151.58433: worker is 1 (out of 1 available) 16142 1727204151.58450: exiting _queue_task() for managed-node2/include_tasks 16142 1727204151.58461: done queuing things up, now waiting for results queue to drain 16142 1727204151.58463: waiting for pending results... 16142 1727204151.58760: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 16142 1727204151.58925: in run() - task 0affcd87-79f5-fddd-f6c7-00000000016c 16142 1727204151.58948: variable 'ansible_search_path' from source: unknown 16142 1727204151.58955: variable 'ansible_search_path' from source: unknown 16142 1727204151.59002: calling self._execute() 16142 1727204151.59114: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204151.59129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204151.59146: variable 'omit' from source: magic vars 16142 1727204151.59758: variable 'ansible_distribution_major_version' from source: facts 16142 1727204151.59796: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204151.59898: _execute() done 16142 1727204151.59907: dumping result to json 16142 1727204151.59916: done dumping result, returning 16142 1727204151.59928: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-fddd-f6c7-00000000016c] 16142 1727204151.59942: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000016c 16142 1727204151.60099: no more pending results, returning what we have 16142 1727204151.60105: in VariableManager get_vars() 16142 1727204151.60172: Calling all_inventory to load vars for managed-node2 16142 1727204151.60175: Calling groups_inventory to load vars for managed-node2 16142 1727204151.60178: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204151.60190: Calling all_plugins_play to load vars for managed-node2 16142 1727204151.60194: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204151.60198: Calling groups_plugins_play to load vars for managed-node2 16142 1727204151.61384: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000016c 16142 1727204151.61388: WORKER PROCESS EXITING 16142 1727204151.63824: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204151.66480: done with get_vars() 16142 1727204151.66512: variable 'ansible_search_path' from source: unknown 16142 1727204151.66514: variable 'ansible_search_path' from source: unknown 16142 1727204151.66561: we have included files to process 16142 1727204151.66563: generating all_blocks data 16142 1727204151.66567: done generating all_blocks data 16142 1727204151.66573: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 16142 1727204151.66574: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 16142 1727204151.66577: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 16142 1727204151.67198: done processing included file 16142 1727204151.67200: iterating over new_blocks loaded from include file 16142 1727204151.67201: in VariableManager get_vars() 16142 1727204151.67242: done with get_vars() 16142 1727204151.67244: filtering new block on tags 16142 1727204151.67278: done filtering new block on tags 16142 1727204151.67281: in VariableManager get_vars() 16142 1727204151.67316: done with get_vars() 16142 1727204151.67318: filtering new block on tags 16142 1727204151.67365: done filtering new block on tags 16142 1727204151.67368: in VariableManager get_vars() 16142 1727204151.67400: done with get_vars() 16142 1727204151.67402: filtering new block on tags 16142 1727204151.67445: done filtering new block on tags 16142 1727204151.67447: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 16142 1727204151.67453: extending task lists for all hosts with included blocks 16142 1727204151.69403: done extending task lists 16142 1727204151.69405: done processing included files 16142 1727204151.69406: results queue empty 16142 1727204151.69407: checking for any_errors_fatal 16142 1727204151.69410: done checking for any_errors_fatal 16142 1727204151.69411: checking for max_fail_percentage 16142 1727204151.69412: done checking for max_fail_percentage 16142 1727204151.69413: checking to see if all hosts have failed and the running result is not ok 16142 1727204151.69414: done checking to see if all hosts have failed 16142 1727204151.69415: getting the remaining hosts for this loop 16142 1727204151.69416: done getting the remaining hosts for this loop 16142 1727204151.69419: getting the next task for host managed-node2 16142 1727204151.69424: done getting next task for host managed-node2 16142 1727204151.69427: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 16142 1727204151.69431: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 16142 1727204151.69447: getting variables 16142 1727204151.69448: in VariableManager get_vars() 16142 1727204151.69478: Calling all_inventory to load vars for managed-node2 16142 1727204151.69481: Calling groups_inventory to load vars for managed-node2 16142 1727204151.69483: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204151.69490: Calling all_plugins_play to load vars for managed-node2 16142 1727204151.69492: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204151.69495: Calling groups_plugins_play to load vars for managed-node2 16142 1727204151.70994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204151.72756: done with get_vars() 16142 1727204151.72784: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:55:51 -0400 (0:00:00.147) 0:00:50.905 ***** 16142 1727204151.72891: entering _queue_task() for managed-node2/setup 16142 1727204151.74849: worker is 1 (out of 1 available) 16142 1727204151.74875: exiting _queue_task() for managed-node2/setup 16142 1727204151.74891: done queuing things up, now waiting for results queue to drain 16142 1727204151.74895: waiting for pending results... 
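The set_facts.yml tasks included above begin with "Ensure ansible_facts used by role are present", which re-gathers facts only when something the role needs is missing from ansible_facts; the lines that follow show its condition evaluating to False because every required fact is already cached. A sketch of that guard, with the when: expression copied from the log and the setup arguments assumed rather than taken from the role source:

    - name: Ensure ansible_facts used by role are present
      setup:
        gather_subset: min   # assumed subset; the actual module arguments are not visible in this log
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0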
16142 1727204151.75452: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 16142 1727204151.76001: in run() - task 0affcd87-79f5-fddd-f6c7-000000000914 16142 1727204151.76038: variable 'ansible_search_path' from source: unknown 16142 1727204151.76050: variable 'ansible_search_path' from source: unknown 16142 1727204151.76171: calling self._execute() 16142 1727204151.76424: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204151.76561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204151.76579: variable 'omit' from source: magic vars 16142 1727204151.77312: variable 'ansible_distribution_major_version' from source: facts 16142 1727204151.77444: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204151.77990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204151.83376: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204151.83489: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204151.83676: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204151.83719: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204151.83877: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204151.84087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204151.84142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204151.84185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204151.84328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204151.84357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204151.84542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204151.84575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204151.84720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204151.84770: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204151.84791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204151.85100: variable '__network_required_facts' from source: role '' defaults 16142 1727204151.85171: variable 'ansible_facts' from source: unknown 16142 1727204151.86895: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 16142 1727204151.87030: when evaluation is False, skipping this task 16142 1727204151.87053: _execute() done 16142 1727204151.87072: dumping result to json 16142 1727204151.87091: done dumping result, returning 16142 1727204151.87112: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-fddd-f6c7-000000000914] 16142 1727204151.87140: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000914 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204151.87398: no more pending results, returning what we have 16142 1727204151.87405: results queue empty 16142 1727204151.87406: checking for any_errors_fatal 16142 1727204151.87410: done checking for any_errors_fatal 16142 1727204151.87411: checking for max_fail_percentage 16142 1727204151.87413: done checking for max_fail_percentage 16142 1727204151.87414: checking to see if all hosts have failed and the running result is not ok 16142 1727204151.87415: done checking to see if all hosts have failed 16142 1727204151.87418: getting the remaining hosts for this loop 16142 1727204151.87419: done getting the remaining hosts for this loop 16142 1727204151.87427: getting the next task for host managed-node2 16142 1727204151.87441: done getting next task for host managed-node2 16142 1727204151.87448: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 16142 1727204151.87460: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 16142 1727204151.87499: getting variables 16142 1727204151.87502: in VariableManager get_vars() 16142 1727204151.87586: Calling all_inventory to load vars for managed-node2 16142 1727204151.87589: Calling groups_inventory to load vars for managed-node2 16142 1727204151.87595: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204151.87612: Calling all_plugins_play to load vars for managed-node2 16142 1727204151.87618: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204151.87625: Calling groups_plugins_play to load vars for managed-node2 16142 1727204151.89372: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000914 16142 1727204151.89377: WORKER PROCESS EXITING 16142 1727204151.90829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204151.93556: done with get_vars() 16142 1727204151.93597: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:55:51 -0400 (0:00:00.215) 0:00:51.121 ***** 16142 1727204151.94433: entering _queue_task() for managed-node2/stat 16142 1727204151.94786: worker is 1 (out of 1 available) 16142 1727204151.94798: exiting _queue_task() for managed-node2/stat 16142 1727204151.94811: done queuing things up, now waiting for results queue to drain 16142 1727204151.94812: waiting for pending results... 16142 1727204151.96465: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 16142 1727204151.96978: in run() - task 0affcd87-79f5-fddd-f6c7-000000000916 16142 1727204151.96993: variable 'ansible_search_path' from source: unknown 16142 1727204151.96996: variable 'ansible_search_path' from source: unknown 16142 1727204151.97046: calling self._execute() 16142 1727204151.97343: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204151.97351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204151.97363: variable 'omit' from source: magic vars 16142 1727204151.98321: variable 'ansible_distribution_major_version' from source: facts 16142 1727204151.98341: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204151.98628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204151.99331: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204151.99394: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204151.99431: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204151.99465: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204151.99762: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204151.99798: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204151.99824: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204151.99968: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204152.00177: variable '__network_is_ostree' from source: set_fact 16142 1727204152.00185: Evaluated conditional (not __network_is_ostree is defined): False 16142 1727204152.00188: when evaluation is False, skipping this task 16142 1727204152.00190: _execute() done 16142 1727204152.00195: dumping result to json 16142 1727204152.00197: done dumping result, returning 16142 1727204152.00207: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-fddd-f6c7-000000000916] 16142 1727204152.00214: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000916 16142 1727204152.00315: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000916 16142 1727204152.00318: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 16142 1727204152.00375: no more pending results, returning what we have 16142 1727204152.00380: results queue empty 16142 1727204152.00381: checking for any_errors_fatal 16142 1727204152.00387: done checking for any_errors_fatal 16142 1727204152.00388: checking for max_fail_percentage 16142 1727204152.00390: done checking for max_fail_percentage 16142 1727204152.00391: checking to see if all hosts have failed and the running result is not ok 16142 1727204152.00391: done checking to see if all hosts have failed 16142 1727204152.00392: getting the remaining hosts for this loop 16142 1727204152.00393: done getting the remaining hosts for this loop 16142 1727204152.00397: getting the next task for host managed-node2 16142 1727204152.00404: done getting next task for host managed-node2 16142 1727204152.00408: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 16142 1727204152.00415: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 16142 1727204152.00439: getting variables 16142 1727204152.00441: in VariableManager get_vars() 16142 1727204152.00498: Calling all_inventory to load vars for managed-node2 16142 1727204152.00501: Calling groups_inventory to load vars for managed-node2 16142 1727204152.00503: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204152.00514: Calling all_plugins_play to load vars for managed-node2 16142 1727204152.00516: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204152.00519: Calling groups_plugins_play to load vars for managed-node2 16142 1727204152.04281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204152.09778: done with get_vars() 16142 1727204152.09815: done getting variables 16142 1727204152.09887: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:55:52 -0400 (0:00:00.154) 0:00:51.276 ***** 16142 1727204152.09930: entering _queue_task() for managed-node2/set_fact 16142 1727204152.10302: worker is 1 (out of 1 available) 16142 1727204152.10314: exiting _queue_task() for managed-node2/set_fact 16142 1727204152.10327: done queuing things up, now waiting for results queue to drain 16142 1727204152.10328: waiting for pending results... 16142 1727204152.11300: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 16142 1727204152.11720: in run() - task 0affcd87-79f5-fddd-f6c7-000000000917 16142 1727204152.11879: variable 'ansible_search_path' from source: unknown 16142 1727204152.11887: variable 'ansible_search_path' from source: unknown 16142 1727204152.11929: calling self._execute() 16142 1727204152.12048: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204152.12191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204152.12206: variable 'omit' from source: magic vars 16142 1727204152.12931: variable 'ansible_distribution_major_version' from source: facts 16142 1727204152.13069: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204152.13356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204152.13993: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204152.14055: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204152.14098: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204152.14143: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204152.14239: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204152.14274: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204152.14307: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204152.14341: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204152.14444: variable '__network_is_ostree' from source: set_fact 16142 1727204152.14458: Evaluated conditional (not __network_is_ostree is defined): False 16142 1727204152.14469: when evaluation is False, skipping this task 16142 1727204152.14477: _execute() done 16142 1727204152.14484: dumping result to json 16142 1727204152.14492: done dumping result, returning 16142 1727204152.14504: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-fddd-f6c7-000000000917] 16142 1727204152.14515: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000917 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 16142 1727204152.14678: no more pending results, returning what we have 16142 1727204152.14683: results queue empty 16142 1727204152.14684: checking for any_errors_fatal 16142 1727204152.14691: done checking for any_errors_fatal 16142 1727204152.14692: checking for max_fail_percentage 16142 1727204152.14694: done checking for max_fail_percentage 16142 1727204152.14695: checking to see if all hosts have failed and the running result is not ok 16142 1727204152.14696: done checking to see if all hosts have failed 16142 1727204152.14697: getting the remaining hosts for this loop 16142 1727204152.14698: done getting the remaining hosts for this loop 16142 1727204152.14702: getting the next task for host managed-node2 16142 1727204152.14716: done getting next task for host managed-node2 16142 1727204152.14720: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 16142 1727204152.14726: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 16142 1727204152.14753: getting variables 16142 1727204152.14755: in VariableManager get_vars() 16142 1727204152.14816: Calling all_inventory to load vars for managed-node2 16142 1727204152.14820: Calling groups_inventory to load vars for managed-node2 16142 1727204152.14822: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204152.14834: Calling all_plugins_play to load vars for managed-node2 16142 1727204152.14841: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204152.14844: Calling groups_plugins_play to load vars for managed-node2 16142 1727204152.16286: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000917 16142 1727204152.16290: WORKER PROCESS EXITING 16142 1727204152.16675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204152.19169: done with get_vars() 16142 1727204152.19202: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:55:52 -0400 (0:00:00.093) 0:00:51.369 ***** 16142 1727204152.19310: entering _queue_task() for managed-node2/service_facts 16142 1727204152.19667: worker is 1 (out of 1 available) 16142 1727204152.19680: exiting _queue_task() for managed-node2/service_facts 16142 1727204152.19691: done queuing things up, now waiting for results queue to drain 16142 1727204152.19693: waiting for pending results... 16142 1727204152.20003: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 16142 1727204152.20203: in run() - task 0affcd87-79f5-fddd-f6c7-000000000919 16142 1727204152.20225: variable 'ansible_search_path' from source: unknown 16142 1727204152.20233: variable 'ansible_search_path' from source: unknown 16142 1727204152.20283: calling self._execute() 16142 1727204152.20406: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204152.20473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204152.20494: variable 'omit' from source: magic vars 16142 1727204152.21305: variable 'ansible_distribution_major_version' from source: facts 16142 1727204152.21457: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204152.21475: variable 'omit' from source: magic vars 16142 1727204152.21577: variable 'omit' from source: magic vars 16142 1727204152.21700: variable 'omit' from source: magic vars 16142 1727204152.21751: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204152.21984: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204152.22010: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204152.22081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204152.22275: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204152.22311: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204152.22320: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204152.22327: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204152.22548: Set connection var ansible_timeout to 10 16142 1727204152.22585: Set connection var ansible_connection to ssh 16142 1727204152.22594: Set connection var ansible_shell_type to sh 16142 1727204152.22601: Set connection var ansible_shell_executable to /bin/sh 16142 1727204152.22608: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204152.22699: Set connection var ansible_pipelining to False 16142 1727204152.22730: variable 'ansible_shell_executable' from source: unknown 16142 1727204152.22738: variable 'ansible_connection' from source: unknown 16142 1727204152.22746: variable 'ansible_module_compression' from source: unknown 16142 1727204152.22752: variable 'ansible_shell_type' from source: unknown 16142 1727204152.22758: variable 'ansible_shell_executable' from source: unknown 16142 1727204152.22766: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204152.22774: variable 'ansible_pipelining' from source: unknown 16142 1727204152.22803: variable 'ansible_timeout' from source: unknown 16142 1727204152.22812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204152.23460: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204152.23577: variable 'omit' from source: magic vars 16142 1727204152.23677: starting attempt loop 16142 1727204152.23686: running the handler 16142 1727204152.23705: _low_level_execute_command(): starting 16142 1727204152.23717: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204152.25787: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204152.25791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204152.25948: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204152.25952: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204152.25955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204152.26163: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204152.26169: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204152.26239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204152.27912: stdout chunk (state=3): >>>/root <<< 16142 1727204152.28010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204152.28101: 
stderr chunk (state=3): >>><<< 16142 1727204152.28105: stdout chunk (state=3): >>><<< 16142 1727204152.28221: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204152.28225: _low_level_execute_command(): starting 16142 1727204152.28228: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204152.2812386-19890-181183565556271 `" && echo ansible-tmp-1727204152.2812386-19890-181183565556271="` echo /root/.ansible/tmp/ansible-tmp-1727204152.2812386-19890-181183565556271 `" ) && sleep 0' 16142 1727204152.30069: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204152.30086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204152.30097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204152.30112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204152.30155: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204152.30169: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204152.30172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204152.30198: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204152.30206: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204152.30214: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204152.30221: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204152.30231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204152.30246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204152.30255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204152.30261: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204152.30273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 
1727204152.30353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204152.30381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204152.30394: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204152.30479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204152.32390: stdout chunk (state=3): >>>ansible-tmp-1727204152.2812386-19890-181183565556271=/root/.ansible/tmp/ansible-tmp-1727204152.2812386-19890-181183565556271 <<< 16142 1727204152.32593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204152.32597: stdout chunk (state=3): >>><<< 16142 1727204152.32600: stderr chunk (state=3): >>><<< 16142 1727204152.32626: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204152.2812386-19890-181183565556271=/root/.ansible/tmp/ansible-tmp-1727204152.2812386-19890-181183565556271 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204152.32677: variable 'ansible_module_compression' from source: unknown 16142 1727204152.32725: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 16142 1727204152.32769: variable 'ansible_facts' from source: unknown 16142 1727204152.32836: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204152.2812386-19890-181183565556271/AnsiballZ_service_facts.py 16142 1727204152.33074: Sending initial data 16142 1727204152.33077: Sent initial data (162 bytes) 16142 1727204152.34152: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204152.34158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204152.34208: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204152.34212: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204152.34226: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204152.34245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204152.34252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204152.34333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204152.34363: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204152.34452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204152.36172: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204152.36212: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204152.36253: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpt6_n8maf /root/.ansible/tmp/ansible-tmp-1727204152.2812386-19890-181183565556271/AnsiballZ_service_facts.py <<< 16142 1727204152.36290: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204152.37637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204152.37736: stderr chunk (state=3): >>><<< 16142 1727204152.37740: stdout chunk (state=3): >>><<< 16142 1727204152.37765: done transferring module to remote 16142 1727204152.37778: _low_level_execute_command(): starting 16142 1727204152.37781: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204152.2812386-19890-181183565556271/ /root/.ansible/tmp/ansible-tmp-1727204152.2812386-19890-181183565556271/AnsiballZ_service_facts.py && sleep 0' 16142 1727204152.40476: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204152.40480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204152.40524: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204152.40529: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204152.40671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204152.40676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204152.40762: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204152.40786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204152.40880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204152.42653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204152.42688: stderr chunk (state=3): >>><<< 16142 1727204152.42691: stdout chunk (state=3): >>><<< 16142 1727204152.42790: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204152.42794: _low_level_execute_command(): starting 16142 1727204152.42797: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204152.2812386-19890-181183565556271/AnsiballZ_service_facts.py && sleep 0' 16142 1727204152.44408: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204152.44413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204152.44438: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204152.44442: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204152.44725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204152.44824: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 16142 1727204152.44828: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204152.44894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204153.76036: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 16142 1727204153.76065: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "syst<<< 16142 1727204153.76089: stdout chunk (state=3): >>>emd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 16142 1727204153.77313: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 16142 1727204153.77395: stderr chunk (state=3): >>><<< 16142 1727204153.77398: stdout chunk (state=3): >>><<< 16142 1727204153.77428: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": 
"systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": 
"dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": 
"selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": 
"systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
16142 1727204153.78422: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204152.2812386-19890-181183565556271/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204153.78430: _low_level_execute_command(): starting 16142 1727204153.78438: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204152.2812386-19890-181183565556271/ > /dev/null 2>&1 && sleep 0' 16142 1727204153.79084: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204153.79094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204153.79103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204153.79116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204153.79156: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204153.79163: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204153.79176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204153.79189: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204153.79198: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204153.79206: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204153.79209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204153.79219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204153.79230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204153.79237: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204153.79246: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204153.79256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204153.79330: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204153.79347: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204153.79357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204153.79432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204153.81208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204153.81259: stderr chunk (state=3): >>><<< 16142 1727204153.81263: stdout chunk (state=3): >>><<< 16142 1727204153.81281: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204153.81287: handler run complete 16142 1727204153.81397: variable 'ansible_facts' from source: unknown 16142 1727204153.81500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204153.81744: variable 'ansible_facts' from source: unknown 16142 1727204153.81823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204153.81933: attempt loop complete, returning result 16142 1727204153.81937: _execute() done 16142 1727204153.81942: dumping result to json 16142 1727204153.81977: done dumping result, returning 16142 1727204153.81984: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-fddd-f6c7-000000000919] 16142 1727204153.81989: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000919 16142 1727204153.82747: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000919 16142 1727204153.82750: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204153.82804: no more pending results, returning what we have 16142 1727204153.82807: results queue empty 16142 1727204153.82808: checking for any_errors_fatal 16142 1727204153.82811: done checking for any_errors_fatal 16142 1727204153.82811: checking for max_fail_percentage 16142 1727204153.82813: done checking for max_fail_percentage 16142 1727204153.82813: checking to see if all hosts have failed and the running result is not ok 16142 1727204153.82814: done checking to see if all hosts have failed 16142 1727204153.82815: getting the remaining hosts for this loop 16142 1727204153.82816: done getting the remaining hosts for this loop 16142 1727204153.82818: getting the next task for host managed-node2 16142 1727204153.82822: done getting next task for host managed-node2 16142 1727204153.82825: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 16142 1727204153.82829: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 16142 1727204153.82837: getting variables 16142 1727204153.82839: in VariableManager get_vars() 16142 1727204153.82871: Calling all_inventory to load vars for managed-node2 16142 1727204153.82873: Calling groups_inventory to load vars for managed-node2 16142 1727204153.82878: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204153.82884: Calling all_plugins_play to load vars for managed-node2 16142 1727204153.82886: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204153.82888: Calling groups_plugins_play to load vars for managed-node2 16142 1727204153.84046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204153.85788: done with get_vars() 16142 1727204153.85810: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:55:53 -0400 (0:00:01.665) 0:00:53.035 ***** 16142 1727204153.85893: entering _queue_task() for managed-node2/package_facts 16142 1727204153.86144: worker is 1 (out of 1 available) 16142 1727204153.86169: exiting _queue_task() for managed-node2/package_facts 16142 1727204153.86180: done queuing things up, now waiting for results queue to drain 16142 1727204153.86182: waiting for pending results... 
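[editor's note: the next task runs the package_facts module over the same SSH connection. A minimal sketch of an equivalent standalone task follows, assuming the default package-manager auto-detection; it is not the role's actual task file, and the openssl key is used only because that package appears in the module output below.]

    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto

    - name: Show one package version from the gathered facts
      ansible.builtin.debug:
        msg: "openssl {{ ansible_facts.packages['openssl'][0].version }}"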
16142 1727204153.86373: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 16142 1727204153.86487: in run() - task 0affcd87-79f5-fddd-f6c7-00000000091a 16142 1727204153.86499: variable 'ansible_search_path' from source: unknown 16142 1727204153.86505: variable 'ansible_search_path' from source: unknown 16142 1727204153.86536: calling self._execute() 16142 1727204153.86623: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204153.86629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204153.86633: variable 'omit' from source: magic vars 16142 1727204153.86921: variable 'ansible_distribution_major_version' from source: facts 16142 1727204153.86931: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204153.86937: variable 'omit' from source: magic vars 16142 1727204153.87000: variable 'omit' from source: magic vars 16142 1727204153.87025: variable 'omit' from source: magic vars 16142 1727204153.87063: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204153.87091: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204153.87109: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204153.87122: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204153.87132: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204153.87159: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204153.87163: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204153.87165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204153.87240: Set connection var ansible_timeout to 10 16142 1727204153.87243: Set connection var ansible_connection to ssh 16142 1727204153.87249: Set connection var ansible_shell_type to sh 16142 1727204153.87254: Set connection var ansible_shell_executable to /bin/sh 16142 1727204153.87261: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204153.87269: Set connection var ansible_pipelining to False 16142 1727204153.87324: variable 'ansible_shell_executable' from source: unknown 16142 1727204153.87327: variable 'ansible_connection' from source: unknown 16142 1727204153.87330: variable 'ansible_module_compression' from source: unknown 16142 1727204153.87332: variable 'ansible_shell_type' from source: unknown 16142 1727204153.87334: variable 'ansible_shell_executable' from source: unknown 16142 1727204153.87336: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204153.87338: variable 'ansible_pipelining' from source: unknown 16142 1727204153.87340: variable 'ansible_timeout' from source: unknown 16142 1727204153.87342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204153.87507: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204153.87511: variable 'omit' from source: magic vars 16142 
1727204153.87514: starting attempt loop 16142 1727204153.87517: running the handler 16142 1727204153.87519: _low_level_execute_command(): starting 16142 1727204153.87524: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204153.88063: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204153.88098: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204153.88109: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 16142 1727204153.88115: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204153.88125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204153.88188: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204153.88197: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204153.88265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204153.89827: stdout chunk (state=3): >>>/root <<< 16142 1727204153.89931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204153.89996: stderr chunk (state=3): >>><<< 16142 1727204153.90001: stdout chunk (state=3): >>><<< 16142 1727204153.90013: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204153.90025: _low_level_execute_command(): starting 16142 1727204153.90031: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204153.9001215-19960-135070563359516 `" && echo 
ansible-tmp-1727204153.9001215-19960-135070563359516="` echo /root/.ansible/tmp/ansible-tmp-1727204153.9001215-19960-135070563359516 `" ) && sleep 0' 16142 1727204153.90487: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204153.90491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204153.90527: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204153.90542: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204153.90544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204153.90595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204153.90599: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204153.90605: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204153.90645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204153.92506: stdout chunk (state=3): >>>ansible-tmp-1727204153.9001215-19960-135070563359516=/root/.ansible/tmp/ansible-tmp-1727204153.9001215-19960-135070563359516 <<< 16142 1727204153.92616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204153.92698: stderr chunk (state=3): >>><<< 16142 1727204153.92704: stdout chunk (state=3): >>><<< 16142 1727204153.92728: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204153.9001215-19960-135070563359516=/root/.ansible/tmp/ansible-tmp-1727204153.9001215-19960-135070563359516 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204153.92779: variable 'ansible_module_compression' from source: unknown 16142 
1727204153.92833: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 16142 1727204153.92893: variable 'ansible_facts' from source: unknown 16142 1727204153.93085: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204153.9001215-19960-135070563359516/AnsiballZ_package_facts.py 16142 1727204153.93240: Sending initial data 16142 1727204153.93244: Sent initial data (162 bytes) 16142 1727204153.94200: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204153.94210: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204153.94220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204153.94234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204153.94274: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204153.94281: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204153.94293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204153.94305: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204153.94313: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204153.94320: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204153.94328: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204153.94339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204153.94348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204153.94356: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204153.94363: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204153.94377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204153.94450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204153.94474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204153.94486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204153.94552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204153.96296: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204153.96326: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204153.96371: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpf20uqjeq /root/.ansible/tmp/ansible-tmp-1727204153.9001215-19960-135070563359516/AnsiballZ_package_facts.py <<< 16142 1727204153.96401: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204153.98793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204153.98885: stderr chunk (state=3): >>><<< 16142 1727204153.98889: stdout chunk (state=3): >>><<< 16142 1727204153.98904: done transferring module to remote 16142 1727204153.98915: _low_level_execute_command(): starting 16142 1727204153.98920: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204153.9001215-19960-135070563359516/ /root/.ansible/tmp/ansible-tmp-1727204153.9001215-19960-135070563359516/AnsiballZ_package_facts.py && sleep 0' 16142 1727204153.99582: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204153.99590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204153.99601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204153.99615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204153.99658: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204153.99665: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204153.99679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204153.99692: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204153.99699: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204153.99705: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204153.99716: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204153.99727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204153.99739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204153.99742: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204153.99755: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204153.99763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204153.99842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204153.99857: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204153.99873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204153.99941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204154.01756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204154.01760: stdout chunk (state=3): >>><<< 16142 1727204154.01767: stderr chunk (state=3): >>><<< 16142 1727204154.01794: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204154.01798: _low_level_execute_command(): starting 16142 1727204154.01800: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204153.9001215-19960-135070563359516/AnsiballZ_package_facts.py && sleep 0' 16142 1727204154.02801: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204154.02805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204154.02848: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204154.02852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204154.02868: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 16142 1727204154.02880: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204154.02885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204154.02974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204154.02987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204154.03068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204154.49120: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": 
"5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gli<<< 16142 1727204154.49147: stdout chunk (state=3): >>>bc": [{"name": "glibc", "version": "2.34", 
"release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": 
[{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{<<< 16142 1727204154.49152: stdout chunk (state=3): >>>"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": <<< 16142 1727204154.49169: stdout chunk (state=3): >>>"53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4"<<< 16142 1727204154.49176: stdout chunk (state=3): >>>, "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x<<< 16142 1727204154.49202: stdout chunk (state=3): >>>86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": 
"1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": 
"initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 16142 1727204154.49237: stdout chunk (state=3): >>>libssh": [{"name": "libssh", "version": "0.10.4", "release": 
"13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "rel<<< 16142 1727204154.49241: stdout chunk (state=3): >>>ease": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], <<< 16142 1727204154.49260: stdout chunk (state=3): >>>"slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", 
"version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": 
"10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles"<<< 16142 1727204154.49280: stdout chunk (state=3): >>>: [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", 
"release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pe<<< 16142 1727204154.49298: stdout chunk (state=3): >>>rl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}],<<< 16142 1727204154.49309: stdout chunk (state=3): >>> "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", 
"release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "<<< 16142 1727204154.49340: stdout chunk (state=3): >>>0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": 
"481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "s<<< 16142 1727204154.49346: stdout chunk (state=3): >>>ource": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el<<< 16142 1727204154.49375: stdout chunk (state=3): >>>9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 16142 1727204154.51060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204154.51305: stderr chunk (state=3): >>><<< 16142 1727204154.51309: stdout chunk (state=3): >>><<< 16142 1727204154.51353: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": 
"json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": 
"cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": 
"selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": 
"12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": 
[{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": 
"1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": 
"4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": 
"noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 
4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": 
"libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": 
"perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
16142 1727204154.55041: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204153.9001215-19960-135070563359516/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204154.55089: _low_level_execute_command(): starting 16142 1727204154.55102: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204153.9001215-19960-135070563359516/ > /dev/null 2>&1 && sleep 0' 16142 1727204154.56605: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204154.56648: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204154.56755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204154.56777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204154.56821: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204154.56833: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204154.56856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204154.56879: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204154.56891: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204154.56901: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204154.56913: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204154.56926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204154.56942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204154.56954: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204154.56976: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204154.56991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204154.57069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204154.57209: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204154.57230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204154.57310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204154.59236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204154.59240: stdout chunk (state=3): >>><<< 16142 1727204154.59242: stderr chunk (state=3): >>><<< 16142 1727204154.59269: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204154.59573: handler run complete 16142 1727204154.61595: variable 'ansible_facts' from source: unknown 16142 1727204154.62966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204154.67886: variable 'ansible_facts' from source: unknown 16142 1727204154.69866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204154.71957: attempt loop complete, returning result 16142 1727204154.72025: _execute() done 16142 1727204154.72114: dumping result to json 16142 1727204154.72601: done dumping result, returning 16142 1727204154.72618: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-fddd-f6c7-00000000091a] 16142 1727204154.72662: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000091a 16142 1727204154.77825: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000091a 16142 1727204154.77829: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204154.77989: no more pending results, returning what we have 16142 1727204154.77991: results queue empty 16142 1727204154.77992: checking for any_errors_fatal 16142 1727204154.77996: done checking for any_errors_fatal 16142 1727204154.77997: checking for max_fail_percentage 16142 1727204154.77999: done checking for max_fail_percentage 16142 1727204154.77999: checking to see if all hosts have failed and the running result is not ok 16142 1727204154.78000: done checking to see if all hosts have failed 16142 1727204154.78001: getting the remaining hosts for this loop 16142 1727204154.78002: done getting the remaining hosts for this loop 16142 1727204154.78005: getting the next task for host managed-node2 16142 1727204154.78012: done getting next task for host managed-node2 16142 1727204154.78017: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 16142 1727204154.78021: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 16142 1727204154.78034: getting variables 16142 1727204154.78035: in VariableManager get_vars() 16142 1727204154.78082: Calling all_inventory to load vars for managed-node2 16142 1727204154.78086: Calling groups_inventory to load vars for managed-node2 16142 1727204154.78088: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204154.78097: Calling all_plugins_play to load vars for managed-node2 16142 1727204154.78099: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204154.78108: Calling groups_plugins_play to load vars for managed-node2 16142 1727204154.80546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204154.84280: done with get_vars() 16142 1727204154.84321: done getting variables 16142 1727204154.84395: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:55:54 -0400 (0:00:00.985) 0:00:54.021 ***** 16142 1727204154.84446: entering _queue_task() for managed-node2/debug 16142 1727204154.84939: worker is 1 (out of 1 available) 16142 1727204154.84952: exiting _queue_task() for managed-node2/debug 16142 1727204154.84968: done queuing things up, now waiting for results queue to drain 16142 1727204154.84969: waiting for pending results... 
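
The task result above is censored because the task runs with no_log enabled (note '_ansible_no_log': True in the module arguments), but the gathered facts remain usable by later tasks through ansible_facts.packages. An illustrative consumer, using wpa_supplicant only because it appears in the package dump above:

- name: Show the installed wpa_supplicant version (illustrative only)
  ansible.builtin.debug:
    msg: "wpa_supplicant {{ ansible_facts.packages['wpa_supplicant'][0].version }}"
  when: "'wpa_supplicant' in ansible_facts.packages"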
16142 1727204154.86222: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 16142 1727204154.86549: in run() - task 0affcd87-79f5-fddd-f6c7-00000000016d 16142 1727204154.86568: variable 'ansible_search_path' from source: unknown 16142 1727204154.86572: variable 'ansible_search_path' from source: unknown 16142 1727204154.86627: calling self._execute() 16142 1727204154.86753: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204154.86756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204154.86771: variable 'omit' from source: magic vars 16142 1727204154.87273: variable 'ansible_distribution_major_version' from source: facts 16142 1727204154.87276: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204154.87279: variable 'omit' from source: magic vars 16142 1727204154.87349: variable 'omit' from source: magic vars 16142 1727204154.87424: variable 'network_provider' from source: set_fact 16142 1727204154.87475: variable 'omit' from source: magic vars 16142 1727204154.87499: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204154.87559: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204154.87671: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204154.87675: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204154.87770: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204154.87773: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204154.87776: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204154.87779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204154.87838: Set connection var ansible_timeout to 10 16142 1727204154.87842: Set connection var ansible_connection to ssh 16142 1727204154.87844: Set connection var ansible_shell_type to sh 16142 1727204154.87847: Set connection var ansible_shell_executable to /bin/sh 16142 1727204154.87866: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204154.87874: Set connection var ansible_pipelining to False 16142 1727204154.87899: variable 'ansible_shell_executable' from source: unknown 16142 1727204154.87903: variable 'ansible_connection' from source: unknown 16142 1727204154.87906: variable 'ansible_module_compression' from source: unknown 16142 1727204154.87908: variable 'ansible_shell_type' from source: unknown 16142 1727204154.87910: variable 'ansible_shell_executable' from source: unknown 16142 1727204154.87913: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204154.87915: variable 'ansible_pipelining' from source: unknown 16142 1727204154.87917: variable 'ansible_timeout' from source: unknown 16142 1727204154.87921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204154.88080: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 16142 1727204154.88094: variable 'omit' from source: magic vars 16142 1727204154.88099: starting attempt loop 16142 1727204154.88102: running the handler 16142 1727204154.88147: handler run complete 16142 1727204154.88163: attempt loop complete, returning result 16142 1727204154.88168: _execute() done 16142 1727204154.88171: dumping result to json 16142 1727204154.88174: done dumping result, returning 16142 1727204154.88188: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-fddd-f6c7-00000000016d] 16142 1727204154.88198: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000016d 16142 1727204154.88296: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000016d 16142 1727204154.88300: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 16142 1727204154.88382: no more pending results, returning what we have 16142 1727204154.88386: results queue empty 16142 1727204154.88387: checking for any_errors_fatal 16142 1727204154.88399: done checking for any_errors_fatal 16142 1727204154.88400: checking for max_fail_percentage 16142 1727204154.88404: done checking for max_fail_percentage 16142 1727204154.88405: checking to see if all hosts have failed and the running result is not ok 16142 1727204154.88406: done checking to see if all hosts have failed 16142 1727204154.88406: getting the remaining hosts for this loop 16142 1727204154.88408: done getting the remaining hosts for this loop 16142 1727204154.88413: getting the next task for host managed-node2 16142 1727204154.88422: done getting next task for host managed-node2 16142 1727204154.88426: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 16142 1727204154.88437: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 16142 1727204154.88459: getting variables 16142 1727204154.88462: in VariableManager get_vars() 16142 1727204154.88525: Calling all_inventory to load vars for managed-node2 16142 1727204154.88528: Calling groups_inventory to load vars for managed-node2 16142 1727204154.88530: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204154.88541: Calling all_plugins_play to load vars for managed-node2 16142 1727204154.88544: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204154.88548: Calling groups_plugins_play to load vars for managed-node2 16142 1727204154.91985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204154.94705: done with get_vars() 16142 1727204154.94742: done getting variables 16142 1727204154.94899: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:55:54 -0400 (0:00:00.104) 0:00:54.126 ***** 16142 1727204154.94942: entering _queue_task() for managed-node2/fail 16142 1727204154.95326: worker is 1 (out of 1 available) 16142 1727204154.95339: exiting _queue_task() for managed-node2/fail 16142 1727204154.95352: done queuing things up, now waiting for results queue to drain 16142 1727204154.95353: waiting for pending results... 
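
The "Print network provider" task above (roles/network/tasks/main.yml:7) is a plain debug of the network_provider value set earlier in the role via set_fact; on this host it reports the NetworkManager provider ("Using network provider: nm"). Roughly:

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"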
16142 1727204154.96663: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 16142 1727204154.96671: in run() - task 0affcd87-79f5-fddd-f6c7-00000000016e 16142 1727204154.96674: variable 'ansible_search_path' from source: unknown 16142 1727204154.96678: variable 'ansible_search_path' from source: unknown 16142 1727204154.96681: calling self._execute() 16142 1727204154.96683: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204154.96690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204154.96693: variable 'omit' from source: magic vars 16142 1727204154.96695: variable 'ansible_distribution_major_version' from source: facts 16142 1727204154.96698: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204154.96853: variable 'network_state' from source: role '' defaults 16142 1727204154.96857: Evaluated conditional (network_state != {}): False 16142 1727204154.96860: when evaluation is False, skipping this task 16142 1727204154.96867: _execute() done 16142 1727204154.96870: dumping result to json 16142 1727204154.96872: done dumping result, returning 16142 1727204154.96875: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-fddd-f6c7-00000000016e] 16142 1727204154.96878: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000016e 16142 1727204154.96988: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000016e 16142 1727204154.96991: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204154.97070: no more pending results, returning what we have 16142 1727204154.97075: results queue empty 16142 1727204154.97076: checking for any_errors_fatal 16142 1727204154.97085: done checking for any_errors_fatal 16142 1727204154.97086: checking for max_fail_percentage 16142 1727204154.97089: done checking for max_fail_percentage 16142 1727204154.97090: checking to see if all hosts have failed and the running result is not ok 16142 1727204154.97091: done checking to see if all hosts have failed 16142 1727204154.97092: getting the remaining hosts for this loop 16142 1727204154.97093: done getting the remaining hosts for this loop 16142 1727204154.97098: getting the next task for host managed-node2 16142 1727204154.97106: done getting next task for host managed-node2 16142 1727204154.97111: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 16142 1727204154.97116: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 16142 1727204154.97146: getting variables 16142 1727204154.97149: in VariableManager get_vars() 16142 1727204154.97223: Calling all_inventory to load vars for managed-node2 16142 1727204154.97227: Calling groups_inventory to load vars for managed-node2 16142 1727204154.97230: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204154.97246: Calling all_plugins_play to load vars for managed-node2 16142 1727204154.97249: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204154.97253: Calling groups_plugins_play to load vars for managed-node2 16142 1727204154.99634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204155.02209: done with get_vars() 16142 1727204155.02354: done getting variables 16142 1727204155.02435: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:55:55 -0400 (0:00:00.075) 0:00:54.201 ***** 16142 1727204155.02476: entering _queue_task() for managed-node2/fail 16142 1727204155.02869: worker is 1 (out of 1 available) 16142 1727204155.02882: exiting _queue_task() for managed-node2/fail 16142 1727204155.02893: done queuing things up, now waiting for results queue to drain 16142 1727204155.02894: waiting for pending results... 
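
The skip above shows the guard that matters here: network_state comes from the role defaults and is empty, so the initscripts abort never evaluates anything further. A sketch of the guard (the failure message and the second condition are assumptions inferred from the task name, not shown in this log):

- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: The network_state variable is not supported with the initscripts provider.  # hypothetical wording
  when:
    - network_state != {}
    - network_provider == "initscripts"  # assumption based on the task name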
16142 1727204155.03213: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 16142 1727204155.03447: in run() - task 0affcd87-79f5-fddd-f6c7-00000000016f 16142 1727204155.03460: variable 'ansible_search_path' from source: unknown 16142 1727204155.03480: variable 'ansible_search_path' from source: unknown 16142 1727204155.03511: calling self._execute() 16142 1727204155.03943: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204155.03947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204155.03962: variable 'omit' from source: magic vars 16142 1727204155.04457: variable 'ansible_distribution_major_version' from source: facts 16142 1727204155.04476: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204155.04608: variable 'network_state' from source: role '' defaults 16142 1727204155.04616: Evaluated conditional (network_state != {}): False 16142 1727204155.04620: when evaluation is False, skipping this task 16142 1727204155.04622: _execute() done 16142 1727204155.04626: dumping result to json 16142 1727204155.04628: done dumping result, returning 16142 1727204155.04656: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-fddd-f6c7-00000000016f] 16142 1727204155.04813: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000016f 16142 1727204155.04969: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000016f skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204155.05022: no more pending results, returning what we have 16142 1727204155.05026: results queue empty 16142 1727204155.05027: checking for any_errors_fatal 16142 1727204155.05037: done checking for any_errors_fatal 16142 1727204155.05038: checking for max_fail_percentage 16142 1727204155.05041: done checking for max_fail_percentage 16142 1727204155.05042: checking to see if all hosts have failed and the running result is not ok 16142 1727204155.05042: done checking to see if all hosts have failed 16142 1727204155.05043: getting the remaining hosts for this loop 16142 1727204155.05044: done getting the remaining hosts for this loop 16142 1727204155.05048: getting the next task for host managed-node2 16142 1727204155.05055: done getting next task for host managed-node2 16142 1727204155.05060: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 16142 1727204155.05066: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 16142 1727204155.05097: getting variables 16142 1727204155.05099: in VariableManager get_vars() 16142 1727204155.05155: Calling all_inventory to load vars for managed-node2 16142 1727204155.05158: Calling groups_inventory to load vars for managed-node2 16142 1727204155.05160: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204155.05173: Calling all_plugins_play to load vars for managed-node2 16142 1727204155.05176: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204155.05180: Calling groups_plugins_play to load vars for managed-node2 16142 1727204155.05700: WORKER PROCESS EXITING 16142 1727204155.06296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204155.07834: done with get_vars() 16142 1727204155.07857: done getting variables 16142 1727204155.07903: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:55:55 -0400 (0:00:00.054) 0:00:54.256 ***** 16142 1727204155.07930: entering _queue_task() for managed-node2/fail 16142 1727204155.08186: worker is 1 (out of 1 available) 16142 1727204155.08199: exiting _queue_task() for managed-node2/fail 16142 1727204155.08211: done queuing things up, now waiting for results queue to drain 16142 1727204155.08212: waiting for pending results... 
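
The "system version below 8" abort that was just skipped shares the same first condition (its false_condition is again network_state != {}), so on this run both network_state guards are bypassed simply because the play never supplies network_state; the version comparison itself is never reached. A hedged sketch of an invocation that would make these guards evaluate their remaining conditions (the nmstate-style content is hypothetical):

- hosts: managed-node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:      # any non-empty mapping makes the `network_state != {}` condition true
          interfaces: []    # hypothetical content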
16142 1727204155.08413: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 16142 1727204155.08520: in run() - task 0affcd87-79f5-fddd-f6c7-000000000170 16142 1727204155.08531: variable 'ansible_search_path' from source: unknown 16142 1727204155.08535: variable 'ansible_search_path' from source: unknown 16142 1727204155.08571: calling self._execute() 16142 1727204155.08645: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204155.08650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204155.08659: variable 'omit' from source: magic vars 16142 1727204155.08942: variable 'ansible_distribution_major_version' from source: facts 16142 1727204155.08963: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204155.09089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204155.10756: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204155.10804: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204155.10834: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204155.10862: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204155.10886: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204155.10960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204155.10983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204155.11001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.11031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204155.11045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204155.11120: variable 'ansible_distribution_major_version' from source: facts 16142 1727204155.11136: Evaluated conditional (ansible_distribution_major_version | int > 9): False 16142 1727204155.11140: when evaluation is False, skipping this task 16142 1727204155.11143: _execute() done 16142 1727204155.11146: dumping result to json 16142 1727204155.11148: done dumping result, returning 16142 1727204155.11157: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-fddd-f6c7-000000000170] 16142 1727204155.11160: sending task result for task 
0affcd87-79f5-fddd-f6c7-000000000170 16142 1727204155.11251: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000170 16142 1727204155.11254: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 16142 1727204155.11309: no more pending results, returning what we have 16142 1727204155.11313: results queue empty 16142 1727204155.11314: checking for any_errors_fatal 16142 1727204155.11320: done checking for any_errors_fatal 16142 1727204155.11320: checking for max_fail_percentage 16142 1727204155.11323: done checking for max_fail_percentage 16142 1727204155.11324: checking to see if all hosts have failed and the running result is not ok 16142 1727204155.11324: done checking to see if all hosts have failed 16142 1727204155.11325: getting the remaining hosts for this loop 16142 1727204155.11326: done getting the remaining hosts for this loop 16142 1727204155.11330: getting the next task for host managed-node2 16142 1727204155.11337: done getting next task for host managed-node2 16142 1727204155.11342: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 16142 1727204155.11347: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 16142 1727204155.11376: getting variables 16142 1727204155.11378: in VariableManager get_vars() 16142 1727204155.11428: Calling all_inventory to load vars for managed-node2 16142 1727204155.11431: Calling groups_inventory to load vars for managed-node2 16142 1727204155.11433: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204155.11441: Calling all_plugins_play to load vars for managed-node2 16142 1727204155.11444: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204155.11446: Calling groups_plugins_play to load vars for managed-node2 16142 1727204155.12530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204155.17731: done with get_vars() 16142 1727204155.17752: done getting variables 16142 1727204155.17791: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:55:55 -0400 (0:00:00.098) 0:00:54.354 ***** 16142 1727204155.17812: entering _queue_task() for managed-node2/dnf 16142 1727204155.18066: worker is 1 (out of 1 available) 16142 1727204155.18080: exiting _queue_task() for managed-node2/dnf 16142 1727204155.18093: done queuing things up, now waiting for results queue to drain 16142 1727204155.18095: waiting for pending results... 
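
The teaming abort above is keyed on the distribution major version; the skip record shows the exact expression, ansible_distribution_major_version | int > 9, which is false on this EL9 host. A sketch of that guard (the message, and any additional check that a team interface is actually requested, are assumptions):

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later.  # hypothetical wording
  when: ansible_distribution_major_version | int > 9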
16142 1727204155.18282: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 16142 1727204155.18393: in run() - task 0affcd87-79f5-fddd-f6c7-000000000171 16142 1727204155.18404: variable 'ansible_search_path' from source: unknown 16142 1727204155.18408: variable 'ansible_search_path' from source: unknown 16142 1727204155.18456: calling self._execute() 16142 1727204155.18522: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204155.18526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204155.18535: variable 'omit' from source: magic vars 16142 1727204155.18819: variable 'ansible_distribution_major_version' from source: facts 16142 1727204155.18829: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204155.18968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204155.20604: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204155.20659: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204155.20689: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204155.20716: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204155.20739: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204155.20794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204155.20813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204155.20834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.20865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204155.20876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204155.20953: variable 'ansible_distribution' from source: facts 16142 1727204155.20957: variable 'ansible_distribution_major_version' from source: facts 16142 1727204155.20971: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 16142 1727204155.21046: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204155.21130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204155.21148: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204155.21168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.21197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204155.21207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204155.21238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204155.21253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204155.21273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.21300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204155.21311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204155.21340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204155.21354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204155.21372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.21400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204155.21411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204155.21513: variable 'network_connections' from source: task vars 16142 1727204155.21523: variable 'controller_profile' from source: play vars 16142 1727204155.21569: variable 'controller_profile' from source: play vars 16142 1727204155.21620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204155.21741: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204155.21768: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204155.21790: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204155.21815: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204155.21846: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204155.21862: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204155.21885: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.21903: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204155.21941: variable '__network_team_connections_defined' from source: role '' defaults 16142 1727204155.22097: variable 'network_connections' from source: task vars 16142 1727204155.22100: variable 'controller_profile' from source: play vars 16142 1727204155.22143: variable 'controller_profile' from source: play vars 16142 1727204155.22163: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 16142 1727204155.22168: when evaluation is False, skipping this task 16142 1727204155.22171: _execute() done 16142 1727204155.22174: dumping result to json 16142 1727204155.22177: done dumping result, returning 16142 1727204155.22184: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-fddd-f6c7-000000000171] 16142 1727204155.22189: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000171 16142 1727204155.22289: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000171 16142 1727204155.22292: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 16142 1727204155.22340: no more pending results, returning what we have 16142 1727204155.22343: results queue empty 16142 1727204155.22344: checking for any_errors_fatal 16142 1727204155.22352: done checking for any_errors_fatal 16142 1727204155.22357: checking for max_fail_percentage 16142 1727204155.22359: done checking for max_fail_percentage 16142 1727204155.22360: checking to see if all hosts have failed and the running result is not ok 16142 1727204155.22361: done checking to see if all hosts have failed 16142 1727204155.22362: getting the remaining hosts for this loop 16142 1727204155.22363: done getting the remaining hosts for this loop 16142 1727204155.22370: getting the next task for host managed-node2 16142 1727204155.22378: done getting next task for host managed-node2 16142 1727204155.22382: ^ task is: TASK: fedora.linux_system_roles.network : 
Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 16142 1727204155.22385: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 16142 1727204155.22406: getting variables 16142 1727204155.22408: in VariableManager get_vars() 16142 1727204155.22459: Calling all_inventory to load vars for managed-node2 16142 1727204155.22466: Calling groups_inventory to load vars for managed-node2 16142 1727204155.22469: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204155.22481: Calling all_plugins_play to load vars for managed-node2 16142 1727204155.22484: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204155.22487: Calling groups_plugins_play to load vars for managed-node2 16142 1727204155.23310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204155.24260: done with get_vars() 16142 1727204155.24279: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 16142 1727204155.24334: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:55:55 -0400 (0:00:00.065) 0:00:54.420 ***** 16142 1727204155.24359: entering _queue_task() for managed-node2/yum 16142 1727204155.24593: worker is 1 (out of 1 available) 16142 1727204155.24606: exiting _queue_task() for managed-node2/yum 16142 1727204155.24617: done queuing things up, now waiting for results queue to drain 16142 1727204155.24619: waiting for pending results... 
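
The DNF check that was just skipped (tasks/main.yml:36) is gated on two expressions that both appear in the log above: the host must be Fedora or EL8+, and the requested connections must include wireless or team interfaces; neither flag is set for this run's single controller_profile connection. A sketch under the assumption that the task simply asks dnf, in check mode, whether the relevant packages could be updated (the package list variable is hypothetical; the real one is not visible in this log):

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"  # hypothetical variable name
    state: latest
  check_mode: true
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined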
16142 1727204155.24812: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 16142 1727204155.24912: in run() - task 0affcd87-79f5-fddd-f6c7-000000000172 16142 1727204155.24923: variable 'ansible_search_path' from source: unknown 16142 1727204155.24926: variable 'ansible_search_path' from source: unknown 16142 1727204155.24960: calling self._execute() 16142 1727204155.25039: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204155.25047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204155.25057: variable 'omit' from source: magic vars 16142 1727204155.25346: variable 'ansible_distribution_major_version' from source: facts 16142 1727204155.25356: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204155.25483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204155.27144: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204155.27200: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204155.27229: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204155.27261: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204155.27282: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204155.27340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204155.27365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204155.27384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.27410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204155.27421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204155.27498: variable 'ansible_distribution_major_version' from source: facts 16142 1727204155.27511: Evaluated conditional (ansible_distribution_major_version | int < 8): False 16142 1727204155.27514: when evaluation is False, skipping this task 16142 1727204155.27517: _execute() done 16142 1727204155.27520: dumping result to json 16142 1727204155.27523: done dumping result, returning 16142 1727204155.27531: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-fddd-f6c7-000000000172] 16142 
1727204155.27536: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000172 16142 1727204155.27633: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000172 16142 1727204155.27636: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 16142 1727204155.27692: no more pending results, returning what we have 16142 1727204155.27696: results queue empty 16142 1727204155.27697: checking for any_errors_fatal 16142 1727204155.27703: done checking for any_errors_fatal 16142 1727204155.27704: checking for max_fail_percentage 16142 1727204155.27706: done checking for max_fail_percentage 16142 1727204155.27707: checking to see if all hosts have failed and the running result is not ok 16142 1727204155.27707: done checking to see if all hosts have failed 16142 1727204155.27708: getting the remaining hosts for this loop 16142 1727204155.27709: done getting the remaining hosts for this loop 16142 1727204155.27713: getting the next task for host managed-node2 16142 1727204155.27722: done getting next task for host managed-node2 16142 1727204155.27725: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 16142 1727204155.27729: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 16142 1727204155.27753: getting variables 16142 1727204155.27755: in VariableManager get_vars() 16142 1727204155.27813: Calling all_inventory to load vars for managed-node2 16142 1727204155.27816: Calling groups_inventory to load vars for managed-node2 16142 1727204155.27818: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204155.27827: Calling all_plugins_play to load vars for managed-node2 16142 1727204155.27829: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204155.27832: Calling groups_plugins_play to load vars for managed-node2 16142 1727204155.28817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204155.29740: done with get_vars() 16142 1727204155.29758: done getting variables 16142 1727204155.29802: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:55:55 -0400 (0:00:00.054) 0:00:54.475 ***** 16142 1727204155.29827: entering _queue_task() for managed-node2/fail 16142 1727204155.30083: worker is 1 (out of 1 available) 16142 1727204155.30097: exiting _queue_task() for managed-node2/fail 16142 1727204155.30110: done queuing things up, now waiting for results queue to drain 16142 1727204155.30111: waiting for pending results... 
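
The YUM variant (tasks/main.yml:48) is gated the opposite way, ansible_distribution_major_version | int < 8, so it is skipped on this EL9 host; note in the log above that ansible.builtin.yum is redirected to the ansible.builtin.dnf action plugin on this controller. The guard in isolation, mirroring the sketch above (same hypothetical package list variable):

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"  # hypothetical, as above
    state: latest
  check_mode: true
  when: ansible_distribution_major_version | int < 8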
16142 1727204155.30317: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 16142 1727204155.30433: in run() - task 0affcd87-79f5-fddd-f6c7-000000000173 16142 1727204155.30445: variable 'ansible_search_path' from source: unknown 16142 1727204155.30449: variable 'ansible_search_path' from source: unknown 16142 1727204155.30482: calling self._execute() 16142 1727204155.30565: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204155.30571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204155.30579: variable 'omit' from source: magic vars 16142 1727204155.30861: variable 'ansible_distribution_major_version' from source: facts 16142 1727204155.30873: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204155.30957: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204155.31098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204155.32726: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204155.32783: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204155.32813: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204155.32842: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204155.32861: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204155.32922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204155.32945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204155.32967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.32993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204155.33004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204155.33041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204155.33057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204155.33078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.33103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204155.33113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204155.33144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204155.33162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204155.33181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.33221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204155.33229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204155.33352: variable 'network_connections' from source: task vars 16142 1727204155.33361: variable 'controller_profile' from source: play vars 16142 1727204155.33414: variable 'controller_profile' from source: play vars 16142 1727204155.33469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204155.33580: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204155.33619: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204155.33642: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204155.33665: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204155.33699: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204155.33714: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204155.33731: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.33750: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204155.33789: variable '__network_team_connections_defined' from source: role '' defaults 16142 1727204155.33947: variable 
'network_connections' from source: task vars 16142 1727204155.33950: variable 'controller_profile' from source: play vars 16142 1727204155.33996: variable 'controller_profile' from source: play vars 16142 1727204155.34021: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 16142 1727204155.34025: when evaluation is False, skipping this task 16142 1727204155.34029: _execute() done 16142 1727204155.34032: dumping result to json 16142 1727204155.34034: done dumping result, returning 16142 1727204155.34039: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-fddd-f6c7-000000000173] 16142 1727204155.34042: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000173 16142 1727204155.34141: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000173 16142 1727204155.34144: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 16142 1727204155.34208: no more pending results, returning what we have 16142 1727204155.34212: results queue empty 16142 1727204155.34213: checking for any_errors_fatal 16142 1727204155.34223: done checking for any_errors_fatal 16142 1727204155.34223: checking for max_fail_percentage 16142 1727204155.34226: done checking for max_fail_percentage 16142 1727204155.34226: checking to see if all hosts have failed and the running result is not ok 16142 1727204155.34227: done checking to see if all hosts have failed 16142 1727204155.34228: getting the remaining hosts for this loop 16142 1727204155.34229: done getting the remaining hosts for this loop 16142 1727204155.34238: getting the next task for host managed-node2 16142 1727204155.34246: done getting next task for host managed-node2 16142 1727204155.34250: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 16142 1727204155.34254: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 16142 1727204155.34277: getting variables 16142 1727204155.34279: in VariableManager get_vars() 16142 1727204155.34329: Calling all_inventory to load vars for managed-node2 16142 1727204155.34332: Calling groups_inventory to load vars for managed-node2 16142 1727204155.34334: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204155.34349: Calling all_plugins_play to load vars for managed-node2 16142 1727204155.34352: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204155.34354: Calling groups_plugins_play to load vars for managed-node2 16142 1727204155.35194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204155.36144: done with get_vars() 16142 1727204155.36165: done getting variables 16142 1727204155.36211: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:55:55 -0400 (0:00:00.064) 0:00:54.539 ***** 16142 1727204155.36240: entering _queue_task() for managed-node2/package 16142 1727204155.36495: worker is 1 (out of 1 available) 16142 1727204155.36509: exiting _queue_task() for managed-node2/package 16142 1727204155.36522: done queuing things up, now waiting for results queue to drain 16142 1727204155.36523: waiting for pending results... 16142 1727204155.36722: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 16142 1727204155.36823: in run() - task 0affcd87-79f5-fddd-f6c7-000000000174 16142 1727204155.36839: variable 'ansible_search_path' from source: unknown 16142 1727204155.36843: variable 'ansible_search_path' from source: unknown 16142 1727204155.36873: calling self._execute() 16142 1727204155.36954: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204155.36958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204155.36970: variable 'omit' from source: magic vars 16142 1727204155.37251: variable 'ansible_distribution_major_version' from source: facts 16142 1727204155.37262: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204155.37411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204155.37616: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204155.37650: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204155.37678: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204155.37739: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204155.37813: variable 'network_packages' from source: role '' defaults 16142 1727204155.37887: variable '__network_provider_setup' from source: role '' defaults 16142 1727204155.37896: variable '__network_service_name_default_nm' from source: role '' defaults 16142 1727204155.37959: variable 
'__network_service_name_default_nm' from source: role '' defaults 16142 1727204155.37962: variable '__network_packages_default_nm' from source: role '' defaults 16142 1727204155.38000: variable '__network_packages_default_nm' from source: role '' defaults 16142 1727204155.38120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204155.40835: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204155.40922: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204155.40981: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204155.41019: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204155.41056: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204155.41153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204155.41202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204155.41235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.41301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204155.41321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204155.41377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204155.41417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204155.41450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.41514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204155.41538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204155.41802: variable '__network_packages_default_gobject_packages' from source: role '' defaults 16142 1727204155.41948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204155.41998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204155.42034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.42092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204155.42111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204155.42220: variable 'ansible_python' from source: facts 16142 1727204155.42254: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 16142 1727204155.42354: variable '__network_wpa_supplicant_required' from source: role '' defaults 16142 1727204155.42457: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 16142 1727204155.42577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204155.42596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204155.42619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.42645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204155.42656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204155.42691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204155.42715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204155.42739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.42773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204155.42784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204155.42887: variable 'network_connections' from source: task vars 16142 1727204155.42892: variable 'controller_profile' from source: play vars 16142 1727204155.42967: variable 'controller_profile' from source: play vars 16142 1727204155.43017: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204155.43042: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204155.43062: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.43084: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204155.43119: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204155.43322: variable 'network_connections' from source: task vars 16142 1727204155.43326: variable 'controller_profile' from source: play vars 16142 1727204155.43399: variable 'controller_profile' from source: play vars 16142 1727204155.43423: variable '__network_packages_default_wireless' from source: role '' defaults 16142 1727204155.43481: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204155.43681: variable 'network_connections' from source: task vars 16142 1727204155.43685: variable 'controller_profile' from source: play vars 16142 1727204155.43730: variable 'controller_profile' from source: play vars 16142 1727204155.43747: variable '__network_packages_default_team' from source: role '' defaults 16142 1727204155.43804: variable '__network_team_connections_defined' from source: role '' defaults 16142 1727204155.43998: variable 'network_connections' from source: task vars 16142 1727204155.44003: variable 'controller_profile' from source: play vars 16142 1727204155.44051: variable 'controller_profile' from source: play vars 16142 1727204155.44090: variable '__network_service_name_default_initscripts' from source: role '' defaults 16142 1727204155.44132: variable '__network_service_name_default_initscripts' from source: role '' defaults 16142 1727204155.44141: variable '__network_packages_default_initscripts' from source: role '' defaults 16142 1727204155.44183: variable '__network_packages_default_initscripts' from source: role '' defaults 16142 1727204155.44325: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 16142 1727204155.44646: variable 'network_connections' from source: task vars 16142 1727204155.44649: variable 'controller_profile' from source: play vars 16142 1727204155.44706: variable 'controller_profile' from source: play vars 16142 1727204155.44715: variable 'ansible_distribution' from source: facts 16142 1727204155.44718: variable '__network_rh_distros' from source: role '' defaults 16142 1727204155.44726: variable 'ansible_distribution_major_version' from source: facts 16142 1727204155.44741: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 16142 
1727204155.44896: variable 'ansible_distribution' from source: facts 16142 1727204155.44899: variable '__network_rh_distros' from source: role '' defaults 16142 1727204155.44905: variable 'ansible_distribution_major_version' from source: facts 16142 1727204155.44917: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 16142 1727204155.45072: variable 'ansible_distribution' from source: facts 16142 1727204155.45075: variable '__network_rh_distros' from source: role '' defaults 16142 1727204155.45080: variable 'ansible_distribution_major_version' from source: facts 16142 1727204155.45109: variable 'network_provider' from source: set_fact 16142 1727204155.45119: variable 'ansible_facts' from source: unknown 16142 1727204155.45513: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 16142 1727204155.45517: when evaluation is False, skipping this task 16142 1727204155.45519: _execute() done 16142 1727204155.45522: dumping result to json 16142 1727204155.45525: done dumping result, returning 16142 1727204155.45535: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-fddd-f6c7-000000000174] 16142 1727204155.45540: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000174 16142 1727204155.45631: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000174 16142 1727204155.45635: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 16142 1727204155.45700: no more pending results, returning what we have 16142 1727204155.45704: results queue empty 16142 1727204155.45705: checking for any_errors_fatal 16142 1727204155.45711: done checking for any_errors_fatal 16142 1727204155.45711: checking for max_fail_percentage 16142 1727204155.45713: done checking for max_fail_percentage 16142 1727204155.45714: checking to see if all hosts have failed and the running result is not ok 16142 1727204155.45715: done checking to see if all hosts have failed 16142 1727204155.45715: getting the remaining hosts for this loop 16142 1727204155.45717: done getting the remaining hosts for this loop 16142 1727204155.45721: getting the next task for host managed-node2 16142 1727204155.45728: done getting next task for host managed-node2 16142 1727204155.45732: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 16142 1727204155.45738: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 16142 1727204155.45761: getting variables 16142 1727204155.45763: in VariableManager get_vars() 16142 1727204155.45818: Calling all_inventory to load vars for managed-node2 16142 1727204155.45821: Calling groups_inventory to load vars for managed-node2 16142 1727204155.45823: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204155.45839: Calling all_plugins_play to load vars for managed-node2 16142 1727204155.45842: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204155.45845: Calling groups_plugins_play to load vars for managed-node2 16142 1727204155.47166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204155.48088: done with get_vars() 16142 1727204155.48106: done getting variables 16142 1727204155.48155: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:55:55 -0400 (0:00:00.119) 0:00:54.658 ***** 16142 1727204155.48184: entering _queue_task() for managed-node2/package 16142 1727204155.48434: worker is 1 (out of 1 available) 16142 1727204155.48451: exiting _queue_task() for managed-node2/package 16142 1727204155.48463: done queuing things up, now waiting for results queue to drain 16142 1727204155.48466: waiting for pending results... 
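The task queued here (main.yml:85) uses the 'package' action and, per the skip result that follows, is gated on network_state being non-empty. A rough task-file sketch of that shape in Ansible YAML; the specific package names are inferred from the task name and are assumptions, not quoted from the role:

- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager   # assumed from the task name
      - nmstate          # assumed from the task name
    state: present
  when: network_state != {}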
16142 1727204155.48653: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 16142 1727204155.48767: in run() - task 0affcd87-79f5-fddd-f6c7-000000000175 16142 1727204155.48780: variable 'ansible_search_path' from source: unknown 16142 1727204155.48783: variable 'ansible_search_path' from source: unknown 16142 1727204155.48817: calling self._execute() 16142 1727204155.48898: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204155.48904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204155.48911: variable 'omit' from source: magic vars 16142 1727204155.49194: variable 'ansible_distribution_major_version' from source: facts 16142 1727204155.49204: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204155.49293: variable 'network_state' from source: role '' defaults 16142 1727204155.49301: Evaluated conditional (network_state != {}): False 16142 1727204155.49304: when evaluation is False, skipping this task 16142 1727204155.49307: _execute() done 16142 1727204155.49310: dumping result to json 16142 1727204155.49313: done dumping result, returning 16142 1727204155.49320: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-fddd-f6c7-000000000175] 16142 1727204155.49327: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000175 16142 1727204155.49426: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000175 16142 1727204155.49428: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204155.49501: no more pending results, returning what we have 16142 1727204155.49506: results queue empty 16142 1727204155.49506: checking for any_errors_fatal 16142 1727204155.49513: done checking for any_errors_fatal 16142 1727204155.49514: checking for max_fail_percentage 16142 1727204155.49515: done checking for max_fail_percentage 16142 1727204155.49516: checking to see if all hosts have failed and the running result is not ok 16142 1727204155.49517: done checking to see if all hosts have failed 16142 1727204155.49518: getting the remaining hosts for this loop 16142 1727204155.49519: done getting the remaining hosts for this loop 16142 1727204155.49522: getting the next task for host managed-node2 16142 1727204155.49528: done getting next task for host managed-node2 16142 1727204155.49534: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 16142 1727204155.49539: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 16142 1727204155.49558: getting variables 16142 1727204155.49560: in VariableManager get_vars() 16142 1727204155.49613: Calling all_inventory to load vars for managed-node2 16142 1727204155.49616: Calling groups_inventory to load vars for managed-node2 16142 1727204155.49618: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204155.49627: Calling all_plugins_play to load vars for managed-node2 16142 1727204155.49629: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204155.49632: Calling groups_plugins_play to load vars for managed-node2 16142 1727204155.50443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204155.51410: done with get_vars() 16142 1727204155.51430: done getting variables 16142 1727204155.51478: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:55:55 -0400 (0:00:00.033) 0:00:54.691 ***** 16142 1727204155.51504: entering _queue_task() for managed-node2/package 16142 1727204155.51754: worker is 1 (out of 1 available) 16142 1727204155.51769: exiting _queue_task() for managed-node2/package 16142 1727204155.51786: done queuing things up, now waiting for results queue to drain 16142 1727204155.51788: waiting for pending results... 
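This task and the python3-libnmstate task that follows (main.yml:96) share the same guard, network_state != {}, and both stay skipped in this run because network_state keeps its role default of {}. For contrast, a hedged illustration of a play that would make that guard evaluate True by passing declarative state to the role; the host, interface name, and settings are invented for the example:

- hosts: managed-node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:                 # any non-empty mapping makes network_state != {} True
          interfaces:
            - name: eth1               # hypothetical interface
              type: ethernet
              state: up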
16142 1727204155.51996: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 16142 1727204155.52108: in run() - task 0affcd87-79f5-fddd-f6c7-000000000176 16142 1727204155.52120: variable 'ansible_search_path' from source: unknown 16142 1727204155.52123: variable 'ansible_search_path' from source: unknown 16142 1727204155.52155: calling self._execute() 16142 1727204155.52243: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204155.52247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204155.52255: variable 'omit' from source: magic vars 16142 1727204155.52543: variable 'ansible_distribution_major_version' from source: facts 16142 1727204155.52552: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204155.52639: variable 'network_state' from source: role '' defaults 16142 1727204155.52646: Evaluated conditional (network_state != {}): False 16142 1727204155.52649: when evaluation is False, skipping this task 16142 1727204155.52652: _execute() done 16142 1727204155.52655: dumping result to json 16142 1727204155.52659: done dumping result, returning 16142 1727204155.52667: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-fddd-f6c7-000000000176] 16142 1727204155.52674: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000176 16142 1727204155.52770: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000176 16142 1727204155.52772: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204155.52823: no more pending results, returning what we have 16142 1727204155.52827: results queue empty 16142 1727204155.52828: checking for any_errors_fatal 16142 1727204155.52834: done checking for any_errors_fatal 16142 1727204155.52835: checking for max_fail_percentage 16142 1727204155.52839: done checking for max_fail_percentage 16142 1727204155.52840: checking to see if all hosts have failed and the running result is not ok 16142 1727204155.52841: done checking to see if all hosts have failed 16142 1727204155.52841: getting the remaining hosts for this loop 16142 1727204155.52843: done getting the remaining hosts for this loop 16142 1727204155.52846: getting the next task for host managed-node2 16142 1727204155.52854: done getting next task for host managed-node2 16142 1727204155.52858: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 16142 1727204155.52861: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 16142 1727204155.52895: getting variables 16142 1727204155.52898: in VariableManager get_vars() 16142 1727204155.52948: Calling all_inventory to load vars for managed-node2 16142 1727204155.52951: Calling groups_inventory to load vars for managed-node2 16142 1727204155.52953: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204155.52962: Calling all_plugins_play to load vars for managed-node2 16142 1727204155.52967: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204155.52970: Calling groups_plugins_play to load vars for managed-node2 16142 1727204155.54539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204155.55967: done with get_vars() 16142 1727204155.55998: done getting variables 16142 1727204155.56068: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:55:55 -0400 (0:00:00.045) 0:00:54.737 ***** 16142 1727204155.56106: entering _queue_task() for managed-node2/service 16142 1727204155.56472: worker is 1 (out of 1 available) 16142 1727204155.56484: exiting _queue_task() for managed-node2/service 16142 1727204155.56496: done queuing things up, now waiting for results queue to drain 16142 1727204155.56497: waiting for pending results... 
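The restart task queued here (main.yml:109) is the action counterpart of the earlier consent gate: the 'service' action plugin is loaded, and the skip that follows shows the same wireless/team conditional. A minimal sketch of such a restart task in Ansible YAML; hard-coding NetworkManager as the service name is an assumption (the role's own variables, such as the network_service_name seen later in the trace, would normally supply it):

- name: Restart NetworkManager due to wireless or team interfaces
  service:
    name: NetworkManager    # assumption; the role resolves the service name from its defaults
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined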
16142 1727204155.56803: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 16142 1727204155.56990: in run() - task 0affcd87-79f5-fddd-f6c7-000000000177 16142 1727204155.57010: variable 'ansible_search_path' from source: unknown 16142 1727204155.57018: variable 'ansible_search_path' from source: unknown 16142 1727204155.57069: calling self._execute() 16142 1727204155.57185: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204155.57203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204155.57219: variable 'omit' from source: magic vars 16142 1727204155.57508: variable 'ansible_distribution_major_version' from source: facts 16142 1727204155.57519: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204155.57607: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204155.57816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204155.60319: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204155.60396: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204155.60434: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204155.60473: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204155.60501: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204155.60577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204155.60648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204155.60654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.60677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204155.60688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204155.60732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204155.60757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204155.60785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 16142 1727204155.60823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204155.60837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204155.60881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204155.60904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204155.60924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.60967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204155.60979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204155.62216: variable 'network_connections' from source: task vars 16142 1727204155.62229: variable 'controller_profile' from source: play vars 16142 1727204155.62299: variable 'controller_profile' from source: play vars 16142 1727204155.62368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204155.62646: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204155.63100: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204155.63130: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204155.63163: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204155.63213: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204155.63234: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204155.63262: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.63288: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204155.63335: variable '__network_team_connections_defined' from source: role '' defaults 16142 1727204155.63637: variable 'network_connections' from source: task vars 16142 1727204155.63645: variable 
'controller_profile' from source: play vars 16142 1727204155.63712: variable 'controller_profile' from source: play vars 16142 1727204155.63737: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 16142 1727204155.63743: when evaluation is False, skipping this task 16142 1727204155.63745: _execute() done 16142 1727204155.63748: dumping result to json 16142 1727204155.63753: done dumping result, returning 16142 1727204155.63761: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-fddd-f6c7-000000000177] 16142 1727204155.63769: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000177 16142 1727204155.63868: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000177 16142 1727204155.63880: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 16142 1727204155.63944: no more pending results, returning what we have 16142 1727204155.63949: results queue empty 16142 1727204155.63950: checking for any_errors_fatal 16142 1727204155.63959: done checking for any_errors_fatal 16142 1727204155.63960: checking for max_fail_percentage 16142 1727204155.63962: done checking for max_fail_percentage 16142 1727204155.63963: checking to see if all hosts have failed and the running result is not ok 16142 1727204155.63966: done checking to see if all hosts have failed 16142 1727204155.63967: getting the remaining hosts for this loop 16142 1727204155.63968: done getting the remaining hosts for this loop 16142 1727204155.63973: getting the next task for host managed-node2 16142 1727204155.63980: done getting next task for host managed-node2 16142 1727204155.63984: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 16142 1727204155.63988: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 16142 1727204155.64010: getting variables 16142 1727204155.64012: in VariableManager get_vars() 16142 1727204155.64069: Calling all_inventory to load vars for managed-node2 16142 1727204155.64072: Calling groups_inventory to load vars for managed-node2 16142 1727204155.64074: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204155.64084: Calling all_plugins_play to load vars for managed-node2 16142 1727204155.64086: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204155.64089: Calling groups_plugins_play to load vars for managed-node2 16142 1727204155.66546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204155.71802: done with get_vars() 16142 1727204155.71838: done getting variables 16142 1727204155.71909: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:55:55 -0400 (0:00:00.158) 0:00:54.896 ***** 16142 1727204155.71945: entering _queue_task() for managed-node2/service 16142 1727204155.72311: worker is 1 (out of 1 available) 16142 1727204155.72324: exiting _queue_task() for managed-node2/service 16142 1727204155.72337: done queuing things up, now waiting for results queue to drain 16142 1727204155.72338: waiting for pending results... 16142 1727204155.73292: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 16142 1727204155.73632: in run() - task 0affcd87-79f5-fddd-f6c7-000000000178 16142 1727204155.73657: variable 'ansible_search_path' from source: unknown 16142 1727204155.73660: variable 'ansible_search_path' from source: unknown 16142 1727204155.73905: calling self._execute() 16142 1727204155.74020: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204155.74024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204155.74036: variable 'omit' from source: magic vars 16142 1727204155.74825: variable 'ansible_distribution_major_version' from source: facts 16142 1727204155.74841: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204155.75109: variable 'network_provider' from source: set_fact 16142 1727204155.75113: variable 'network_state' from source: role '' defaults 16142 1727204155.75124: Evaluated conditional (network_provider == "nm" or network_state != {}): True 16142 1727204155.75132: variable 'omit' from source: magic vars 16142 1727204155.75477: variable 'omit' from source: magic vars 16142 1727204155.75508: variable 'network_service_name' from source: role '' defaults 16142 1727204155.75579: variable 'network_service_name' from source: role '' defaults 16142 1727204155.75879: variable '__network_provider_setup' from source: role '' defaults 16142 1727204155.75885: variable '__network_service_name_default_nm' from source: role '' defaults 16142 1727204155.75945: variable '__network_service_name_default_nm' from source: role '' defaults 16142 1727204155.75953: variable '__network_packages_default_nm' from source: role '' defaults 
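Unlike the skipped tasks above, the guard on this task (network_provider == "nm" or network_state != {}) evaluates True, so the 'service' action proceeds and the trace shows network_service_name being resolved from the role defaults. A hedged sketch of the general shape of such a task (main.yml:122) in Ansible YAML; the started/enabled values are inferred from the task name rather than quoted from the role:

- name: Enable and start NetworkManager
  service:
    name: "{{ network_service_name }}"   # resolved from role defaults, as seen in the trace
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}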
16142 1727204155.76217: variable '__network_packages_default_nm' from source: role '' defaults 16142 1727204155.76441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204155.81514: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204155.81802: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204155.81851: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204155.82194: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204155.82230: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204155.82317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204155.82354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204155.82593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.82641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204155.82663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204155.82717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204155.82748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204155.82782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.82828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204155.82988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204155.83282: variable '__network_packages_default_gobject_packages' from source: role '' defaults 16142 1727204155.83517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204155.83599: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204155.83727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.83774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204155.83794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204155.84037: variable 'ansible_python' from source: facts 16142 1727204155.84069: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 16142 1727204155.84225: variable '__network_wpa_supplicant_required' from source: role '' defaults 16142 1727204155.84507: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 16142 1727204155.84750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204155.84816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204155.84924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.85007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204155.85029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204155.85130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204155.85246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204155.85280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.85325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204155.85408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204155.86711: variable 'network_connections' from 
source: task vars 16142 1727204155.86885: variable 'controller_profile' from source: play vars 16142 1727204155.86982: variable 'controller_profile' from source: play vars 16142 1727204155.87301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204155.87637: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204155.87925: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204155.87979: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204155.88028: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204155.88336: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204155.88376: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204155.88412: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204155.88452: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16142 1727204155.88507: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204155.89045: variable 'network_connections' from source: task vars 16142 1727204155.89280: variable 'controller_profile' from source: play vars 16142 1727204155.89360: variable 'controller_profile' from source: play vars 16142 1727204155.89400: variable '__network_packages_default_wireless' from source: role '' defaults 16142 1727204155.89689: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204155.90295: variable 'network_connections' from source: task vars 16142 1727204155.90306: variable 'controller_profile' from source: play vars 16142 1727204155.90384: variable 'controller_profile' from source: play vars 16142 1727204155.90497: variable '__network_packages_default_team' from source: role '' defaults 16142 1727204155.90585: variable '__network_team_connections_defined' from source: role '' defaults 16142 1727204155.90963: variable 'network_connections' from source: task vars 16142 1727204155.91480: variable 'controller_profile' from source: play vars 16142 1727204155.91568: variable 'controller_profile' from source: play vars 16142 1727204155.91634: variable '__network_service_name_default_initscripts' from source: role '' defaults 16142 1727204155.91833: variable '__network_service_name_default_initscripts' from source: role '' defaults 16142 1727204155.91846: variable '__network_packages_default_initscripts' from source: role '' defaults 16142 1727204155.91914: variable '__network_packages_default_initscripts' from source: role '' defaults 16142 1727204155.92590: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 16142 1727204155.93515: variable 'network_connections' from source: task vars 16142 1727204155.93578: variable 'controller_profile' from source: play 
vars 16142 1727204155.93861: variable 'controller_profile' from source: play vars 16142 1727204155.93879: variable 'ansible_distribution' from source: facts 16142 1727204155.93887: variable '__network_rh_distros' from source: role '' defaults 16142 1727204155.93899: variable 'ansible_distribution_major_version' from source: facts 16142 1727204155.93918: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 16142 1727204155.94098: variable 'ansible_distribution' from source: facts 16142 1727204155.94106: variable '__network_rh_distros' from source: role '' defaults 16142 1727204155.94116: variable 'ansible_distribution_major_version' from source: facts 16142 1727204155.94132: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 16142 1727204155.94506: variable 'ansible_distribution' from source: facts 16142 1727204155.94515: variable '__network_rh_distros' from source: role '' defaults 16142 1727204155.94524: variable 'ansible_distribution_major_version' from source: facts 16142 1727204155.94569: variable 'network_provider' from source: set_fact 16142 1727204155.94595: variable 'omit' from source: magic vars 16142 1727204155.94625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204155.94655: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204155.94681: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204155.94702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204155.94715: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204155.94746: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204155.94754: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204155.94761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204155.94867: Set connection var ansible_timeout to 10 16142 1727204155.94876: Set connection var ansible_connection to ssh 16142 1727204155.94886: Set connection var ansible_shell_type to sh 16142 1727204155.94895: Set connection var ansible_shell_executable to /bin/sh 16142 1727204155.94904: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204155.94914: Set connection var ansible_pipelining to False 16142 1727204155.94944: variable 'ansible_shell_executable' from source: unknown 16142 1727204155.94951: variable 'ansible_connection' from source: unknown 16142 1727204155.94957: variable 'ansible_module_compression' from source: unknown 16142 1727204155.94962: variable 'ansible_shell_type' from source: unknown 16142 1727204155.94975: variable 'ansible_shell_executable' from source: unknown 16142 1727204155.94981: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204155.94989: variable 'ansible_pipelining' from source: unknown 16142 1727204155.94996: variable 'ansible_timeout' from source: unknown 16142 1727204155.95003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204155.95168: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204155.95279: variable 'omit' from source: magic vars 16142 1727204155.95292: starting attempt loop 16142 1727204155.95298: running the handler 16142 1727204155.95379: variable 'ansible_facts' from source: unknown 16142 1727204155.96578: _low_level_execute_command(): starting 16142 1727204155.96825: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204155.98090: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204155.98110: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204155.98125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204155.98144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204155.98190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204155.98203: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204155.98218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204155.98286: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204155.98298: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204155.98308: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204155.98319: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204155.98332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204155.98347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204155.98358: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204155.98372: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204155.98385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204155.98471: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204155.98496: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204155.98516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204155.98595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204156.00269: stdout chunk (state=3): >>>/root <<< 16142 1727204156.00460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204156.00467: stdout chunk (state=3): >>><<< 16142 1727204156.00470: stderr chunk (state=3): >>><<< 16142 1727204156.00593: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204156.00599: _low_level_execute_command(): starting 16142 1727204156.00602: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204156.0049443-20129-30015095981732 `" && echo ansible-tmp-1727204156.0049443-20129-30015095981732="` echo /root/.ansible/tmp/ansible-tmp-1727204156.0049443-20129-30015095981732 `" ) && sleep 0' 16142 1727204156.01629: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204156.01634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204156.01667: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204156.01671: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204156.01673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204156.01863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204156.01886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204156.01986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204156.03862: stdout chunk (state=3): >>>ansible-tmp-1727204156.0049443-20129-30015095981732=/root/.ansible/tmp/ansible-tmp-1727204156.0049443-20129-30015095981732 <<< 16142 1727204156.04065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204156.04069: stdout chunk (state=3): >>><<< 16142 1727204156.04079: stderr chunk (state=3): >>><<< 16142 1727204156.04098: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204156.0049443-20129-30015095981732=/root/.ansible/tmp/ansible-tmp-1727204156.0049443-20129-30015095981732 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204156.04133: variable 'ansible_module_compression' from source: unknown 16142 1727204156.04202: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 16142 1727204156.04274: variable 'ansible_facts' from source: unknown 16142 1727204156.04486: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204156.0049443-20129-30015095981732/AnsiballZ_systemd.py 16142 1727204156.04860: Sending initial data 16142 1727204156.04865: Sent initial data (155 bytes) 16142 1727204156.06321: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204156.06329: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204156.06342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204156.06357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204156.06410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204156.06413: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204156.06417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204156.06430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204156.06437: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204156.06461: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204156.06466: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204156.06468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204156.06492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204156.06495: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204156.06498: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204156.06504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204156.06583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204156.06601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204156.06608: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204156.06680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 16142 1727204156.08406: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204156.08443: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204156.08526: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpfoctscsr /root/.ansible/tmp/ansible-tmp-1727204156.0049443-20129-30015095981732/AnsiballZ_systemd.py <<< 16142 1727204156.08541: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204156.11850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204156.11854: stderr chunk (state=3): >>><<< 16142 1727204156.11856: stdout chunk (state=3): >>><<< 16142 1727204156.11858: done transferring module to remote 16142 1727204156.11860: _low_level_execute_command(): starting 16142 1727204156.11863: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204156.0049443-20129-30015095981732/ /root/.ansible/tmp/ansible-tmp-1727204156.0049443-20129-30015095981732/AnsiballZ_systemd.py && sleep 0' 16142 1727204156.12470: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204156.12479: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204156.12489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204156.12504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204156.12550: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204156.12565: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204156.12732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204156.12738: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204156.12763: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204156.12769: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204156.12772: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204156.12797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204156.12904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204156.12908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204156.12912: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204156.12915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 
1727204156.12918: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204156.12920: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204156.12923: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204156.12926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204156.14584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204156.14614: stderr chunk (state=3): >>><<< 16142 1727204156.14618: stdout chunk (state=3): >>><<< 16142 1727204156.14635: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204156.14642: _low_level_execute_command(): starting 16142 1727204156.14645: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204156.0049443-20129-30015095981732/AnsiballZ_systemd.py && sleep 0' 16142 1727204156.16034: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204156.16072: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204156.16082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204156.16096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204156.16133: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204156.16141: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204156.16153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204156.16173: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204156.16180: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204156.16187: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204156.16194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204156.16203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204156.16213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204156.16220: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204156.16226: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204156.16235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204156.16306: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204156.16321: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204156.16324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204156.16407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204156.41598: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 16142 1727204156.41622: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "6995968", "MemoryAvailable": "infinity", "CPUUsageNSec": "1261374000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", 
"IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice 
sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 16142 1727204156.43238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204156.43242: stdout chunk (state=3): >>><<< 16142 1727204156.43244: stderr chunk (state=3): >>><<< 16142 1727204156.43269: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "6995968", "MemoryAvailable": "infinity", "CPUUsageNSec": "1261374000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": 
"infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", 
"ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
16142 1727204156.43543: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204156.0049443-20129-30015095981732/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204156.43547: _low_level_execute_command(): starting 16142 1727204156.43549: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204156.0049443-20129-30015095981732/ > /dev/null 2>&1 && sleep 0' 16142 1727204156.44996: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204156.45013: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204156.45028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204156.45046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204156.45102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204156.45113: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204156.45127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204156.45143: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204156.45154: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204156.45166: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204156.45182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204156.45196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204156.45211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204156.45223: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204156.45233: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204156.45247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204156.45321: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204156.45349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204156.45368: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204156.45437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204156.47230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204156.47329: stderr chunk (state=3): >>><<< 16142 1727204156.47339: stdout chunk (state=3): >>><<< 16142 1727204156.47674: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204156.47677: handler run complete 16142 1727204156.47680: attempt loop complete, returning result 16142 1727204156.47682: _execute() done 16142 1727204156.47684: dumping result to json 16142 1727204156.47686: done dumping result, returning 16142 1727204156.47688: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-fddd-f6c7-000000000178] 16142 1727204156.47690: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000178 16142 1727204156.47849: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000178 16142 1727204156.47852: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204156.47914: no more pending results, returning what we have 16142 1727204156.47918: results queue empty 16142 1727204156.47919: checking for any_errors_fatal 16142 1727204156.47926: done checking for any_errors_fatal 16142 1727204156.47927: checking for max_fail_percentage 16142 1727204156.47929: done checking for max_fail_percentage 16142 1727204156.47930: checking to see if all hosts have failed and the running result is not ok 16142 1727204156.47930: done checking to see if all hosts have failed 16142 1727204156.47931: getting the remaining hosts for this loop 16142 1727204156.47933: done getting the remaining hosts for this loop 16142 1727204156.47936: getting the next task for host managed-node2 16142 1727204156.47943: done getting next task for host managed-node2 16142 1727204156.47948: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 16142 1727204156.47952: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 16142 1727204156.47974: getting variables 16142 1727204156.47976: in VariableManager get_vars() 16142 1727204156.48029: Calling all_inventory to load vars for managed-node2 16142 1727204156.48032: Calling groups_inventory to load vars for managed-node2 16142 1727204156.48035: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204156.48046: Calling all_plugins_play to load vars for managed-node2 16142 1727204156.48049: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204156.48052: Calling groups_plugins_play to load vars for managed-node2 16142 1727204156.49947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204156.52130: done with get_vars() 16142 1727204156.52168: done getting variables 16142 1727204156.52229: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:55:56 -0400 (0:00:00.803) 0:00:55.699 ***** 16142 1727204156.52273: entering _queue_task() for managed-node2/service 16142 1727204156.52677: worker is 1 (out of 1 available) 16142 1727204156.52689: exiting _queue_task() for managed-node2/service 16142 1727204156.52705: done queuing things up, now waiting for results queue to drain 16142 1727204156.52707: waiting for pending results... 
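[editor's note] Before the wpa_supplicant task runs, note how the NetworkManager result was reported: the verbose log above contains the full systemd module return, but the playbook-level "ok: [managed-node2]" line shows only a censored placeholder because the task set no_log. A minimal sketch of that substitution (the message string is copied from the output above; the helper function is an illustration, not Ansible's implementation):

CENSOR_MESSAGE = ("the output has been hidden due to the fact that "
                  "'no_log: true' was specified for this result")

def censor_result(result, no_log):
    """Replace a task result body with the fixed censor message when no_log is set."""
    if not no_log:
        return result
    return {"censored": CENSOR_MESSAGE, "changed": result.get("changed", False)}

full = {"name": "NetworkManager", "changed": False, "state": "started", "enabled": True}
print(censor_result(full, no_log=True))
# {'censored': "the output has been hidden ...", 'changed': False}
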
16142 1727204156.53553: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 16142 1727204156.53671: in run() - task 0affcd87-79f5-fddd-f6c7-000000000179 16142 1727204156.53696: variable 'ansible_search_path' from source: unknown 16142 1727204156.53700: variable 'ansible_search_path' from source: unknown 16142 1727204156.53731: calling self._execute() 16142 1727204156.53815: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204156.53820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204156.53828: variable 'omit' from source: magic vars 16142 1727204156.54117: variable 'ansible_distribution_major_version' from source: facts 16142 1727204156.54129: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204156.54215: variable 'network_provider' from source: set_fact 16142 1727204156.54220: Evaluated conditional (network_provider == "nm"): True 16142 1727204156.54287: variable '__network_wpa_supplicant_required' from source: role '' defaults 16142 1727204156.54352: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 16142 1727204156.54478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204156.56817: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204156.56866: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204156.56896: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204156.56922: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204156.56945: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204156.57004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204156.57024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204156.57044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204156.57074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204156.57086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204156.57118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204156.57134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 16142 1727204156.57153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204156.57182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204156.57192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204156.57221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204156.57239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204156.57254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204156.57283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204156.57293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204156.57392: variable 'network_connections' from source: task vars 16142 1727204156.57401: variable 'controller_profile' from source: play vars 16142 1727204156.57453: variable 'controller_profile' from source: play vars 16142 1727204156.57506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16142 1727204156.57634: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16142 1727204156.57665: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16142 1727204156.57687: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16142 1727204156.57711: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16142 1727204156.57742: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16142 1727204156.57758: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16142 1727204156.57777: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204156.57794: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 16142 1727204156.57833: variable '__network_wireless_connections_defined' from source: role '' defaults 16142 1727204156.57988: variable 'network_connections' from source: task vars 16142 1727204156.57992: variable 'controller_profile' from source: play vars 16142 1727204156.58040: variable 'controller_profile' from source: play vars 16142 1727204156.58060: Evaluated conditional (__network_wpa_supplicant_required): False 16142 1727204156.58065: when evaluation is False, skipping this task 16142 1727204156.58067: _execute() done 16142 1727204156.58070: dumping result to json 16142 1727204156.58077: done dumping result, returning 16142 1727204156.58083: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-fddd-f6c7-000000000179] 16142 1727204156.58094: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000179 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 16142 1727204156.58228: no more pending results, returning what we have 16142 1727204156.58231: results queue empty 16142 1727204156.58234: checking for any_errors_fatal 16142 1727204156.58257: done checking for any_errors_fatal 16142 1727204156.58258: checking for max_fail_percentage 16142 1727204156.58260: done checking for max_fail_percentage 16142 1727204156.58260: checking to see if all hosts have failed and the running result is not ok 16142 1727204156.58261: done checking to see if all hosts have failed 16142 1727204156.58262: getting the remaining hosts for this loop 16142 1727204156.58264: done getting the remaining hosts for this loop 16142 1727204156.58268: getting the next task for host managed-node2 16142 1727204156.58276: done getting next task for host managed-node2 16142 1727204156.58280: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 16142 1727204156.58284: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 16142 1727204156.58339: getting variables 16142 1727204156.58342: in VariableManager get_vars() 16142 1727204156.58391: Calling all_inventory to load vars for managed-node2 16142 1727204156.58397: Calling groups_inventory to load vars for managed-node2 16142 1727204156.58427: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204156.58440: Calling all_plugins_play to load vars for managed-node2 16142 1727204156.58443: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204156.58446: Calling groups_plugins_play to load vars for managed-node2 16142 1727204156.59005: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000179 16142 1727204156.59007: WORKER PROCESS EXITING 16142 1727204156.60001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204156.61156: done with get_vars() 16142 1727204156.61180: done getting variables 16142 1727204156.61228: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:55:56 -0400 (0:00:00.089) 0:00:55.789 ***** 16142 1727204156.61254: entering _queue_task() for managed-node2/service 16142 1727204156.61501: worker is 1 (out of 1 available) 16142 1727204156.61514: exiting _queue_task() for managed-node2/service 16142 1727204156.61527: done queuing things up, now waiting for results queue to drain 16142 1727204156.61529: waiting for pending results... 
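Annotation: the wpa_supplicant task traced above is skipped because the role flag __network_wpa_supplicant_required evaluated to False, and the skip record names that failing condition in false_condition. A hedged sketch of a when-guarded service task that would produce exactly this skip output (the variable and service names come from the log; the task body is an assumption, not the role's verified source):

    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant
        enabled: true
        state: started
      when: __network_wpa_supplicant_required   # False here, so Ansible reports "Conditional result was False"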
16142 1727204156.61733: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 16142 1727204156.61840: in run() - task 0affcd87-79f5-fddd-f6c7-00000000017a 16142 1727204156.61857: variable 'ansible_search_path' from source: unknown 16142 1727204156.61861: variable 'ansible_search_path' from source: unknown 16142 1727204156.61901: calling self._execute() 16142 1727204156.61981: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204156.61985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204156.61994: variable 'omit' from source: magic vars 16142 1727204156.62345: variable 'ansible_distribution_major_version' from source: facts 16142 1727204156.62373: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204156.62861: variable 'network_provider' from source: set_fact 16142 1727204156.62867: Evaluated conditional (network_provider == "initscripts"): False 16142 1727204156.62870: when evaluation is False, skipping this task 16142 1727204156.62873: _execute() done 16142 1727204156.62875: dumping result to json 16142 1727204156.62880: done dumping result, returning 16142 1727204156.62883: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-fddd-f6c7-00000000017a] 16142 1727204156.62885: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000017a 16142 1727204156.62952: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000017a 16142 1727204156.62956: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16142 1727204156.62999: no more pending results, returning what we have 16142 1727204156.63002: results queue empty 16142 1727204156.63003: checking for any_errors_fatal 16142 1727204156.63009: done checking for any_errors_fatal 16142 1727204156.63009: checking for max_fail_percentage 16142 1727204156.63011: done checking for max_fail_percentage 16142 1727204156.63012: checking to see if all hosts have failed and the running result is not ok 16142 1727204156.63013: done checking to see if all hosts have failed 16142 1727204156.63014: getting the remaining hosts for this loop 16142 1727204156.63015: done getting the remaining hosts for this loop 16142 1727204156.63018: getting the next task for host managed-node2 16142 1727204156.63024: done getting next task for host managed-node2 16142 1727204156.63028: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 16142 1727204156.63032: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 16142 1727204156.63052: getting variables 16142 1727204156.63054: in VariableManager get_vars() 16142 1727204156.63111: Calling all_inventory to load vars for managed-node2 16142 1727204156.63114: Calling groups_inventory to load vars for managed-node2 16142 1727204156.63117: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204156.63128: Calling all_plugins_play to load vars for managed-node2 16142 1727204156.63131: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204156.63134: Calling groups_plugins_play to load vars for managed-node2 16142 1727204156.64448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204156.65670: done with get_vars() 16142 1727204156.65688: done getting variables 16142 1727204156.65734: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:55:56 -0400 (0:00:00.045) 0:00:55.834 ***** 16142 1727204156.65761: entering _queue_task() for managed-node2/copy 16142 1727204156.65999: worker is 1 (out of 1 available) 16142 1727204156.66012: exiting _queue_task() for managed-node2/copy 16142 1727204156.66025: done queuing things up, now waiting for results queue to drain 16142 1727204156.66027: waiting for pending results... 
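Annotation: the "Enable network service" task above and the copy task being queued here are both guarded by the provider check, and each is skipped on this host because network_provider resolved to "nm" rather than "initscripts". A sketch of that guard, assuming a simple copy task (the destination path and file content are illustrative; only the condition string appears in the log):

    - name: Ensure initscripts network file dependency is present
      ansible.builtin.copy:
        dest: /etc/sysconfig/network        # illustrative path, not taken from the log
        content: "# Created by the network system role\n"
        mode: "0644"
      when: network_provider == "initscripts"   # False on this host, so the task is skipped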
16142 1727204156.66222: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 16142 1727204156.66328: in run() - task 0affcd87-79f5-fddd-f6c7-00000000017b 16142 1727204156.66343: variable 'ansible_search_path' from source: unknown 16142 1727204156.66347: variable 'ansible_search_path' from source: unknown 16142 1727204156.66383: calling self._execute() 16142 1727204156.66463: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204156.66470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204156.66482: variable 'omit' from source: magic vars 16142 1727204156.67534: variable 'ansible_distribution_major_version' from source: facts 16142 1727204156.67538: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204156.67782: variable 'network_provider' from source: set_fact 16142 1727204156.67787: Evaluated conditional (network_provider == "initscripts"): False 16142 1727204156.67790: when evaluation is False, skipping this task 16142 1727204156.67794: _execute() done 16142 1727204156.67796: dumping result to json 16142 1727204156.67801: done dumping result, returning 16142 1727204156.67810: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-fddd-f6c7-00000000017b] 16142 1727204156.67817: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000017b 16142 1727204156.68036: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000017b 16142 1727204156.68039: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 16142 1727204156.68093: no more pending results, returning what we have 16142 1727204156.68098: results queue empty 16142 1727204156.68099: checking for any_errors_fatal 16142 1727204156.68107: done checking for any_errors_fatal 16142 1727204156.68107: checking for max_fail_percentage 16142 1727204156.68110: done checking for max_fail_percentage 16142 1727204156.68111: checking to see if all hosts have failed and the running result is not ok 16142 1727204156.68111: done checking to see if all hosts have failed 16142 1727204156.68112: getting the remaining hosts for this loop 16142 1727204156.68114: done getting the remaining hosts for this loop 16142 1727204156.68118: getting the next task for host managed-node2 16142 1727204156.68126: done getting next task for host managed-node2 16142 1727204156.68131: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 16142 1727204156.68135: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 16142 1727204156.68163: getting variables 16142 1727204156.68171: in VariableManager get_vars() 16142 1727204156.68225: Calling all_inventory to load vars for managed-node2 16142 1727204156.68228: Calling groups_inventory to load vars for managed-node2 16142 1727204156.68230: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204156.68241: Calling all_plugins_play to load vars for managed-node2 16142 1727204156.68244: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204156.68247: Calling groups_plugins_play to load vars for managed-node2 16142 1727204156.69970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204156.71277: done with get_vars() 16142 1727204156.71297: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:55:56 -0400 (0:00:00.056) 0:00:55.890 ***** 16142 1727204156.71392: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 16142 1727204156.71747: worker is 1 (out of 1 available) 16142 1727204156.71758: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 16142 1727204156.71773: done queuing things up, now waiting for results queue to drain 16142 1727204156.71774: waiting for pending results... 16142 1727204156.72070: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 16142 1727204156.72260: in run() - task 0affcd87-79f5-fddd-f6c7-00000000017c 16142 1727204156.72287: variable 'ansible_search_path' from source: unknown 16142 1727204156.72295: variable 'ansible_search_path' from source: unknown 16142 1727204156.72344: calling self._execute() 16142 1727204156.72460: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204156.72468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204156.72480: variable 'omit' from source: magic vars 16142 1727204156.72882: variable 'ansible_distribution_major_version' from source: facts 16142 1727204156.72893: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204156.72901: variable 'omit' from source: magic vars 16142 1727204156.72973: variable 'omit' from source: magic vars 16142 1727204156.73156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16142 1727204156.75650: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16142 1727204156.75743: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16142 1727204156.75763: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16142 1727204156.75801: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16142 1727204156.75829: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16142 1727204156.75914: variable 'network_provider' from source: set_fact 16142 1727204156.76054: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16142 1727204156.76104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16142 1727204156.76130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16142 1727204156.76177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16142 1727204156.76197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16142 1727204156.76274: variable 'omit' from source: magic vars 16142 1727204156.76397: variable 'omit' from source: magic vars 16142 1727204156.76577: variable 'network_connections' from source: task vars 16142 1727204156.76581: variable 'controller_profile' from source: play vars 16142 1727204156.76631: variable 'controller_profile' from source: play vars 16142 1727204156.76759: variable 'omit' from source: magic vars 16142 1727204156.76769: variable '__lsr_ansible_managed' from source: task vars 16142 1727204156.76828: variable '__lsr_ansible_managed' from source: task vars 16142 1727204156.77022: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 16142 1727204156.77617: Loaded config def from plugin (lookup/template) 16142 1727204156.77620: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 16142 1727204156.77652: File lookup term: get_ansible_managed.j2 16142 1727204156.77656: variable 'ansible_search_path' from source: unknown 16142 1727204156.77659: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 16142 1727204156.77674: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 16142 1727204156.77690: variable 'ansible_search_path' from source: unknown 16142 1727204156.81250: variable 'ansible_managed' from source: unknown 16142 1727204156.81340: variable 
'omit' from source: magic vars 16142 1727204156.81362: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204156.81386: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204156.81401: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204156.81415: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204156.81423: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204156.81448: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204156.81451: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204156.81454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204156.81523: Set connection var ansible_timeout to 10 16142 1727204156.81526: Set connection var ansible_connection to ssh 16142 1727204156.81529: Set connection var ansible_shell_type to sh 16142 1727204156.81536: Set connection var ansible_shell_executable to /bin/sh 16142 1727204156.81543: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204156.81549: Set connection var ansible_pipelining to False 16142 1727204156.81569: variable 'ansible_shell_executable' from source: unknown 16142 1727204156.81572: variable 'ansible_connection' from source: unknown 16142 1727204156.81574: variable 'ansible_module_compression' from source: unknown 16142 1727204156.81576: variable 'ansible_shell_type' from source: unknown 16142 1727204156.81579: variable 'ansible_shell_executable' from source: unknown 16142 1727204156.81582: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204156.81586: variable 'ansible_pipelining' from source: unknown 16142 1727204156.81588: variable 'ansible_timeout' from source: unknown 16142 1727204156.81592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204156.81692: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204156.81706: variable 'omit' from source: magic vars 16142 1727204156.81709: starting attempt loop 16142 1727204156.81713: running the handler 16142 1727204156.81722: _low_level_execute_command(): starting 16142 1727204156.81730: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204156.82247: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204156.82256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204156.82290: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204156.82303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204156.82312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204156.82359: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204156.82381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204156.82385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204156.82439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204156.84100: stdout chunk (state=3): >>>/root <<< 16142 1727204156.84193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204156.84255: stderr chunk (state=3): >>><<< 16142 1727204156.84258: stdout chunk (state=3): >>><<< 16142 1727204156.84279: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204156.84289: _low_level_execute_command(): starting 16142 1727204156.84295: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204156.8428004-20286-124340108392835 `" && echo ansible-tmp-1727204156.8428004-20286-124340108392835="` echo /root/.ansible/tmp/ansible-tmp-1727204156.8428004-20286-124340108392835 `" ) && sleep 0' 16142 1727204156.84769: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204156.84776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204156.84803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204156.84816: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204156.84867: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204156.84880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204156.84890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204156.84934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204156.86801: stdout chunk (state=3): >>>ansible-tmp-1727204156.8428004-20286-124340108392835=/root/.ansible/tmp/ansible-tmp-1727204156.8428004-20286-124340108392835 <<< 16142 1727204156.86916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204156.86967: stderr chunk (state=3): >>><<< 16142 1727204156.86970: stdout chunk (state=3): >>><<< 16142 1727204156.86989: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204156.8428004-20286-124340108392835=/root/.ansible/tmp/ansible-tmp-1727204156.8428004-20286-124340108392835 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204156.87025: variable 'ansible_module_compression' from source: unknown 16142 1727204156.87063: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 16142 1727204156.87088: variable 'ansible_facts' from source: unknown 16142 1727204156.87155: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204156.8428004-20286-124340108392835/AnsiballZ_network_connections.py 16142 1727204156.87269: Sending initial data 16142 1727204156.87272: Sent initial data (168 bytes) 16142 1727204156.87946: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204156.87952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204156.87960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204156.87998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204156.88010: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204156.88022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204156.88068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204156.88081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204156.88129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204156.89834: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204156.89872: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204156.89908: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmptejuyavf /root/.ansible/tmp/ansible-tmp-1727204156.8428004-20286-124340108392835/AnsiballZ_network_connections.py <<< 16142 1727204156.89942: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204156.91091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204156.91199: stderr chunk (state=3): >>><<< 16142 1727204156.91202: stdout chunk (state=3): >>><<< 16142 1727204156.91223: done transferring module to remote 16142 1727204156.91232: _low_level_execute_command(): starting 16142 1727204156.91239: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204156.8428004-20286-124340108392835/ /root/.ansible/tmp/ansible-tmp-1727204156.8428004-20286-124340108392835/AnsiballZ_network_connections.py && sleep 0' 16142 1727204156.91702: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204156.91708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204156.91742: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204156.91756: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204156.91769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204156.91814: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204156.91826: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204156.91877: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204156.93576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204156.93629: stderr chunk (state=3): >>><<< 16142 1727204156.93632: stdout chunk (state=3): >>><<< 16142 1727204156.93648: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204156.93651: _low_level_execute_command(): starting 16142 1727204156.93656: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204156.8428004-20286-124340108392835/AnsiballZ_network_connections.py && sleep 0' 16142 1727204156.94140: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204156.94168: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204156.94181: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204156.94229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204156.94245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204156.94261: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204156.94304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204157.27959: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_8lnwogq2/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_8lnwogq2/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail <<< 16142 1727204157.27972: stdout chunk (state=3): >>>ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/da90ddbf-a91a-40cb-8cf8-f4fc8a58a465: error=unknown <<< 16142 1727204157.28172: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 16142 1727204157.29823: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204157.29881: stderr chunk (state=3): >>><<< 16142 1727204157.29885: stdout chunk (state=3): >>><<< 16142 1727204157.29899: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_8lnwogq2/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_8lnwogq2/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/da90ddbf-a91a-40cb-8cf8-f4fc8a58a465: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
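Annotation: the module_args recorded above show the fedora.linux_system_roles.network_connections module being asked to take the bond0 profile down and remove it (persistent_state: absent). A sketch of the play-level input that would yield those arguments, assuming the usual way the role is driven through variables (the connection name and states come from the log; the surrounding variable layout is an assumption):

    vars:
      network_provider: nm
      network_connections:
        - name: bond0
          state: down
          persistent_state: absent

Note that although the remote module prints an LsrNetworkNmError traceback ("Connection volatilize aborted on bond0 ... error=unknown") to stdout, it still exits 0 and reports changed: true, so the task result recorded below is "changed" rather than "failed".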
16142 1727204157.29931: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'down', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204156.8428004-20286-124340108392835/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204157.29941: _low_level_execute_command(): starting 16142 1727204157.29944: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204156.8428004-20286-124340108392835/ > /dev/null 2>&1 && sleep 0' 16142 1727204157.30408: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204157.30413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204157.30448: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204157.30460: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204157.30513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204157.30524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204157.30529: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204157.30589: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204157.32519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204157.32523: stdout chunk (state=3): >>><<< 16142 1727204157.32528: stderr chunk (state=3): >>><<< 16142 1727204157.32547: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204157.32553: handler run complete 16142 1727204157.32585: attempt loop complete, returning result 16142 1727204157.32588: _execute() done 16142 1727204157.32595: dumping result to json 16142 1727204157.32597: done dumping result, returning 16142 1727204157.32604: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-fddd-f6c7-00000000017c] 16142 1727204157.32610: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000017c 16142 1727204157.32721: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000017c 16142 1727204157.32723: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 16142 1727204157.32822: no more pending results, returning what we have 16142 1727204157.32826: results queue empty 16142 1727204157.32827: checking for any_errors_fatal 16142 1727204157.32833: done checking for any_errors_fatal 16142 1727204157.32833: checking for max_fail_percentage 16142 1727204157.32835: done checking for max_fail_percentage 16142 1727204157.32838: checking to see if all hosts have failed and the running result is not ok 16142 1727204157.32839: done checking to see if all hosts have failed 16142 1727204157.32840: getting the remaining hosts for this loop 16142 1727204157.32841: done getting the remaining hosts for this loop 16142 1727204157.32844: getting the next task for host managed-node2 16142 1727204157.32851: done getting next task for host managed-node2 16142 1727204157.32855: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 16142 1727204157.32858: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 16142 1727204157.32876: getting variables 16142 1727204157.32878: in VariableManager get_vars() 16142 1727204157.32924: Calling all_inventory to load vars for managed-node2 16142 1727204157.32927: Calling groups_inventory to load vars for managed-node2 16142 1727204157.32929: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204157.32940: Calling all_plugins_play to load vars for managed-node2 16142 1727204157.32943: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204157.32945: Calling groups_plugins_play to load vars for managed-node2 16142 1727204157.34595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204157.36378: done with get_vars() 16142 1727204157.36404: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:55:57 -0400 (0:00:00.651) 0:00:56.541 ***** 16142 1727204157.36501: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 16142 1727204157.36863: worker is 1 (out of 1 available) 16142 1727204157.36879: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 16142 1727204157.36891: done queuing things up, now waiting for results queue to drain 16142 1727204157.36893: waiting for pending results... 16142 1727204157.37197: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 16142 1727204157.37373: in run() - task 0affcd87-79f5-fddd-f6c7-00000000017d 16142 1727204157.37395: variable 'ansible_search_path' from source: unknown 16142 1727204157.37402: variable 'ansible_search_path' from source: unknown 16142 1727204157.37450: calling self._execute() 16142 1727204157.37562: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204157.37577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204157.37591: variable 'omit' from source: magic vars 16142 1727204157.37999: variable 'ansible_distribution_major_version' from source: facts 16142 1727204157.38018: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204157.38155: variable 'network_state' from source: role '' defaults 16142 1727204157.38172: Evaluated conditional (network_state != {}): False 16142 1727204157.38180: when evaluation is False, skipping this task 16142 1727204157.38192: _execute() done 16142 1727204157.38205: dumping result to json 16142 1727204157.38213: done dumping result, returning 16142 1727204157.38223: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-fddd-f6c7-00000000017d] 16142 1727204157.38233: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000017d skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16142 1727204157.38392: no more pending results, returning what we have 16142 1727204157.38397: results queue empty 16142 1727204157.38398: checking for any_errors_fatal 16142 1727204157.38413: done checking for any_errors_fatal 16142 1727204157.38414: checking for max_fail_percentage 16142 1727204157.38417: done checking for max_fail_percentage 16142 1727204157.38418: checking to see if all hosts have failed and the running result is 
not ok 16142 1727204157.38419: done checking to see if all hosts have failed 16142 1727204157.38419: getting the remaining hosts for this loop 16142 1727204157.38421: done getting the remaining hosts for this loop 16142 1727204157.38424: getting the next task for host managed-node2 16142 1727204157.38432: done getting next task for host managed-node2 16142 1727204157.38440: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 16142 1727204157.38446: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 16142 1727204157.38472: getting variables 16142 1727204157.38474: in VariableManager get_vars() 16142 1727204157.38529: Calling all_inventory to load vars for managed-node2 16142 1727204157.38534: Calling groups_inventory to load vars for managed-node2 16142 1727204157.38538: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204157.38550: Calling all_plugins_play to load vars for managed-node2 16142 1727204157.38554: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204157.38556: Calling groups_plugins_play to load vars for managed-node2 16142 1727204157.39504: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000017d 16142 1727204157.39508: WORKER PROCESS EXITING 16142 1727204157.40341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204157.42131: done with get_vars() 16142 1727204157.42158: done getting variables 16142 1727204157.42220: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:55:57 -0400 (0:00:00.057) 0:00:56.599 ***** 16142 1727204157.42266: entering _queue_task() for managed-node2/debug 16142 1727204157.42589: worker is 1 (out of 1 available) 16142 1727204157.42601: exiting _queue_task() for managed-node2/debug 16142 1727204157.42613: done queuing things up, now waiting for results queue to drain 16142 1727204157.42614: waiting for pending results... 
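For reference, the "Configure networking connection profiles" result above (changed=true, provider nm, connection bond0 set to state=down / persistent_state=absent) corresponds to role input along these lines. This is a sketch reconstructed from the logged module_args, not the actual play being run:

    - hosts: managed-node2
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_connections:
              - name: bond0
                state: down
                persistent_state: absent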
16142 1727204157.42930: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 16142 1727204157.43114: in run() - task 0affcd87-79f5-fddd-f6c7-00000000017e 16142 1727204157.43139: variable 'ansible_search_path' from source: unknown 16142 1727204157.43148: variable 'ansible_search_path' from source: unknown 16142 1727204157.43196: calling self._execute() 16142 1727204157.43318: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204157.43334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204157.43351: variable 'omit' from source: magic vars 16142 1727204157.43788: variable 'ansible_distribution_major_version' from source: facts 16142 1727204157.43807: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204157.43820: variable 'omit' from source: magic vars 16142 1727204157.43896: variable 'omit' from source: magic vars 16142 1727204157.43940: variable 'omit' from source: magic vars 16142 1727204157.43991: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204157.44029: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204157.44063: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204157.44091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204157.44107: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204157.44145: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204157.44158: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204157.44167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204157.44277: Set connection var ansible_timeout to 10 16142 1727204157.44284: Set connection var ansible_connection to ssh 16142 1727204157.44294: Set connection var ansible_shell_type to sh 16142 1727204157.44308: Set connection var ansible_shell_executable to /bin/sh 16142 1727204157.44318: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204157.44331: Set connection var ansible_pipelining to False 16142 1727204157.44362: variable 'ansible_shell_executable' from source: unknown 16142 1727204157.44376: variable 'ansible_connection' from source: unknown 16142 1727204157.44385: variable 'ansible_module_compression' from source: unknown 16142 1727204157.44392: variable 'ansible_shell_type' from source: unknown 16142 1727204157.44398: variable 'ansible_shell_executable' from source: unknown 16142 1727204157.44406: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204157.44420: variable 'ansible_pipelining' from source: unknown 16142 1727204157.44428: variable 'ansible_timeout' from source: unknown 16142 1727204157.44439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204157.44596: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 
1727204157.44611: variable 'omit' from source: magic vars 16142 1727204157.44620: starting attempt loop 16142 1727204157.44628: running the handler 16142 1727204157.44768: variable '__network_connections_result' from source: set_fact 16142 1727204157.44829: handler run complete 16142 1727204157.44858: attempt loop complete, returning result 16142 1727204157.44867: _execute() done 16142 1727204157.44873: dumping result to json 16142 1727204157.44880: done dumping result, returning 16142 1727204157.44891: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-fddd-f6c7-00000000017e] 16142 1727204157.44899: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000017e 16142 1727204157.45013: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000017e ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 16142 1727204157.45090: no more pending results, returning what we have 16142 1727204157.45095: results queue empty 16142 1727204157.45096: checking for any_errors_fatal 16142 1727204157.45103: done checking for any_errors_fatal 16142 1727204157.45104: checking for max_fail_percentage 16142 1727204157.45106: done checking for max_fail_percentage 16142 1727204157.45107: checking to see if all hosts have failed and the running result is not ok 16142 1727204157.45108: done checking to see if all hosts have failed 16142 1727204157.45108: getting the remaining hosts for this loop 16142 1727204157.45111: done getting the remaining hosts for this loop 16142 1727204157.45114: getting the next task for host managed-node2 16142 1727204157.45124: done getting next task for host managed-node2 16142 1727204157.45128: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 16142 1727204157.45132: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 16142 1727204157.45149: getting variables 16142 1727204157.45151: in VariableManager get_vars() 16142 1727204157.45209: Calling all_inventory to load vars for managed-node2 16142 1727204157.45212: Calling groups_inventory to load vars for managed-node2 16142 1727204157.45216: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204157.45227: Calling all_plugins_play to load vars for managed-node2 16142 1727204157.45230: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204157.45233: Calling groups_plugins_play to load vars for managed-node2 16142 1727204157.46202: WORKER PROCESS EXITING 16142 1727204157.47267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204157.49001: done with get_vars() 16142 1727204157.49030: done getting variables 16142 1727204157.49103: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:55:57 -0400 (0:00:00.068) 0:00:56.668 ***** 16142 1727204157.49144: entering _queue_task() for managed-node2/debug 16142 1727204157.49496: worker is 1 (out of 1 available) 16142 1727204157.49513: exiting _queue_task() for managed-node2/debug 16142 1727204157.49526: done queuing things up, now waiting for results queue to drain 16142 1727204157.49527: waiting for pending results... 
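The debug task that just ran and the one queued next only print fields of __network_connections_result, which the log reports as coming from set_fact. A sketch inferred from the task names and the variables that appear in their output (the real definitions live at roles/network/tasks/main.yml:177 and :181 and are not reproduced in this log):

    - name: Show stderr messages for the network_connections
      debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      debug:
        var: __network_connections_result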
16142 1727204157.49853: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 16142 1727204157.50012: in run() - task 0affcd87-79f5-fddd-f6c7-00000000017f 16142 1727204157.50035: variable 'ansible_search_path' from source: unknown 16142 1727204157.50048: variable 'ansible_search_path' from source: unknown 16142 1727204157.50099: calling self._execute() 16142 1727204157.50213: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204157.50226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204157.50247: variable 'omit' from source: magic vars 16142 1727204157.50658: variable 'ansible_distribution_major_version' from source: facts 16142 1727204157.50679: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204157.50691: variable 'omit' from source: magic vars 16142 1727204157.50770: variable 'omit' from source: magic vars 16142 1727204157.50813: variable 'omit' from source: magic vars 16142 1727204157.50871: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204157.50911: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204157.50948: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204157.50975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204157.50993: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204157.51028: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204157.51045: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204157.51053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204157.51168: Set connection var ansible_timeout to 10 16142 1727204157.51178: Set connection var ansible_connection to ssh 16142 1727204157.51188: Set connection var ansible_shell_type to sh 16142 1727204157.51198: Set connection var ansible_shell_executable to /bin/sh 16142 1727204157.51206: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204157.51216: Set connection var ansible_pipelining to False 16142 1727204157.51243: variable 'ansible_shell_executable' from source: unknown 16142 1727204157.51251: variable 'ansible_connection' from source: unknown 16142 1727204157.51259: variable 'ansible_module_compression' from source: unknown 16142 1727204157.51266: variable 'ansible_shell_type' from source: unknown 16142 1727204157.51272: variable 'ansible_shell_executable' from source: unknown 16142 1727204157.51283: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204157.51291: variable 'ansible_pipelining' from source: unknown 16142 1727204157.51296: variable 'ansible_timeout' from source: unknown 16142 1727204157.51303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204157.51460: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 
1727204157.51483: variable 'omit' from source: magic vars 16142 1727204157.51495: starting attempt loop 16142 1727204157.51504: running the handler 16142 1727204157.51558: variable '__network_connections_result' from source: set_fact 16142 1727204157.51651: variable '__network_connections_result' from source: set_fact 16142 1727204157.51776: handler run complete 16142 1727204157.51807: attempt loop complete, returning result 16142 1727204157.51814: _execute() done 16142 1727204157.51818: dumping result to json 16142 1727204157.51828: done dumping result, returning 16142 1727204157.51840: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-fddd-f6c7-00000000017f] 16142 1727204157.51849: sending task result for task 0affcd87-79f5-fddd-f6c7-00000000017f ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 16142 1727204157.52046: no more pending results, returning what we have 16142 1727204157.52051: results queue empty 16142 1727204157.52052: checking for any_errors_fatal 16142 1727204157.52061: done checking for any_errors_fatal 16142 1727204157.52061: checking for max_fail_percentage 16142 1727204157.52064: done checking for max_fail_percentage 16142 1727204157.52065: checking to see if all hosts have failed and the running result is not ok 16142 1727204157.52066: done checking to see if all hosts have failed 16142 1727204157.52066: getting the remaining hosts for this loop 16142 1727204157.52069: done getting the remaining hosts for this loop 16142 1727204157.52072: getting the next task for host managed-node2 16142 1727204157.52080: done getting next task for host managed-node2 16142 1727204157.52083: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 16142 1727204157.52087: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 16142 1727204157.52100: getting variables 16142 1727204157.52102: in VariableManager get_vars() 16142 1727204157.52159: Calling all_inventory to load vars for managed-node2 16142 1727204157.52162: Calling groups_inventory to load vars for managed-node2 16142 1727204157.52165: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204157.52176: Calling all_plugins_play to load vars for managed-node2 16142 1727204157.52178: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204157.52181: Calling groups_plugins_play to load vars for managed-node2 16142 1727204157.53107: done sending task result for task 0affcd87-79f5-fddd-f6c7-00000000017f 16142 1727204157.53111: WORKER PROCESS EXITING 16142 1727204157.54092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204157.55893: done with get_vars() 16142 1727204157.55937: done getting variables 16142 1727204157.56016: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:55:57 -0400 (0:00:00.069) 0:00:56.737 ***** 16142 1727204157.56057: entering _queue_task() for managed-node2/debug 16142 1727204157.56426: worker is 1 (out of 1 available) 16142 1727204157.56441: exiting _queue_task() for managed-node2/debug 16142 1727204157.56455: done queuing things up, now waiting for results queue to drain 16142 1727204157.56457: waiting for pending results... 
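Both "Configure networking state" (skipped above) and "Show debug messages for the network_state" (queued here and skipped just below) are gated on the same pair of conditionals the log evaluates: ansible_distribution_major_version != '6' is True, while network_state != {} is False because network_state still carries the role default of {}. A sketch of that guard as it would sit on such a task; the debug body itself is an assumption and is not visible in this excerpt:

    - name: Show debug messages for the network_state
      debug:
        var: network_state          # assumed variable; the real task body is not shown in this log
      when:
        - ansible_distribution_major_version != '6'
        - network_state != {}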
16142 1727204157.56789: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 16142 1727204157.56963: in run() - task 0affcd87-79f5-fddd-f6c7-000000000180 16142 1727204157.56987: variable 'ansible_search_path' from source: unknown 16142 1727204157.56995: variable 'ansible_search_path' from source: unknown 16142 1727204157.57045: calling self._execute() 16142 1727204157.57161: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204157.57180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204157.57197: variable 'omit' from source: magic vars 16142 1727204157.57601: variable 'ansible_distribution_major_version' from source: facts 16142 1727204157.57620: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204157.57754: variable 'network_state' from source: role '' defaults 16142 1727204157.57766: Evaluated conditional (network_state != {}): False 16142 1727204157.57770: when evaluation is False, skipping this task 16142 1727204157.57773: _execute() done 16142 1727204157.57776: dumping result to json 16142 1727204157.57787: done dumping result, returning 16142 1727204157.57794: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-fddd-f6c7-000000000180] 16142 1727204157.57801: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000180 16142 1727204157.57905: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000180 16142 1727204157.57909: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 16142 1727204157.57963: no more pending results, returning what we have 16142 1727204157.57969: results queue empty 16142 1727204157.57970: checking for any_errors_fatal 16142 1727204157.57982: done checking for any_errors_fatal 16142 1727204157.57983: checking for max_fail_percentage 16142 1727204157.57985: done checking for max_fail_percentage 16142 1727204157.57986: checking to see if all hosts have failed and the running result is not ok 16142 1727204157.57987: done checking to see if all hosts have failed 16142 1727204157.57988: getting the remaining hosts for this loop 16142 1727204157.57989: done getting the remaining hosts for this loop 16142 1727204157.57993: getting the next task for host managed-node2 16142 1727204157.58002: done getting next task for host managed-node2 16142 1727204157.58007: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 16142 1727204157.58013: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 16142 1727204157.58041: getting variables 16142 1727204157.58043: in VariableManager get_vars() 16142 1727204157.58104: Calling all_inventory to load vars for managed-node2 16142 1727204157.58108: Calling groups_inventory to load vars for managed-node2 16142 1727204157.58111: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204157.58123: Calling all_plugins_play to load vars for managed-node2 16142 1727204157.58127: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204157.58130: Calling groups_plugins_play to load vars for managed-node2 16142 1727204157.59845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204157.60907: done with get_vars() 16142 1727204157.60925: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:55:57 -0400 (0:00:00.049) 0:00:56.786 ***** 16142 1727204157.61001: entering _queue_task() for managed-node2/ping 16142 1727204157.61230: worker is 1 (out of 1 available) 16142 1727204157.61243: exiting _queue_task() for managed-node2/ping 16142 1727204157.61256: done queuing things up, now waiting for results queue to drain 16142 1727204157.61257: waiting for pending results... 16142 1727204157.61451: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 16142 1727204157.61630: in run() - task 0affcd87-79f5-fddd-f6c7-000000000181 16142 1727204157.61662: variable 'ansible_search_path' from source: unknown 16142 1727204157.61675: variable 'ansible_search_path' from source: unknown 16142 1727204157.61714: calling self._execute() 16142 1727204157.61820: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204157.61830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204157.61843: variable 'omit' from source: magic vars 16142 1727204157.62231: variable 'ansible_distribution_major_version' from source: facts 16142 1727204157.62247: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204157.62266: variable 'omit' from source: magic vars 16142 1727204157.62346: variable 'omit' from source: magic vars 16142 1727204157.62390: variable 'omit' from source: magic vars 16142 1727204157.62443: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204157.62483: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204157.62510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204157.62540: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204157.62557: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204157.62593: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204157.62601: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204157.62609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204157.62719: Set connection var ansible_timeout to 10 16142 1727204157.62728: Set connection var 
ansible_connection to ssh 16142 1727204157.62741: Set connection var ansible_shell_type to sh 16142 1727204157.62756: Set connection var ansible_shell_executable to /bin/sh 16142 1727204157.62768: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204157.62781: Set connection var ansible_pipelining to False 16142 1727204157.62808: variable 'ansible_shell_executable' from source: unknown 16142 1727204157.62817: variable 'ansible_connection' from source: unknown 16142 1727204157.62825: variable 'ansible_module_compression' from source: unknown 16142 1727204157.62832: variable 'ansible_shell_type' from source: unknown 16142 1727204157.62838: variable 'ansible_shell_executable' from source: unknown 16142 1727204157.62844: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204157.62852: variable 'ansible_pipelining' from source: unknown 16142 1727204157.62867: variable 'ansible_timeout' from source: unknown 16142 1727204157.62876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204157.63091: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 16142 1727204157.63108: variable 'omit' from source: magic vars 16142 1727204157.63116: starting attempt loop 16142 1727204157.63121: running the handler 16142 1727204157.63135: _low_level_execute_command(): starting 16142 1727204157.63145: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204157.64216: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204157.64225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204157.64270: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204157.64274: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204157.64276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204157.64334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204157.64337: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204157.64340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204157.64390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204157.66057: stdout chunk (state=3): >>>/root <<< 16142 1727204157.66167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204157.66239: stderr chunk (state=3): >>><<< 16142 1727204157.66243: stdout chunk (state=3): >>><<< 16142 1727204157.66353: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204157.66356: _low_level_execute_command(): starting 16142 1727204157.66359: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204157.6626287-20399-21820426483677 `" && echo ansible-tmp-1727204157.6626287-20399-21820426483677="` echo /root/.ansible/tmp/ansible-tmp-1727204157.6626287-20399-21820426483677 `" ) && sleep 0' 16142 1727204157.66921: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204157.66934: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204157.66949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204157.66972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204157.67013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204157.67034: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204157.67049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204157.67070: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204157.67188: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204157.67202: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204157.67216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204157.67231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204157.67247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204157.67260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204157.67275: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204157.67289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204157.67366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204157.67382: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204157.67396: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204157.67717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204157.69654: stdout chunk (state=3): >>>ansible-tmp-1727204157.6626287-20399-21820426483677=/root/.ansible/tmp/ansible-tmp-1727204157.6626287-20399-21820426483677 <<< 16142 1727204157.69778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204157.69868: stderr chunk (state=3): >>><<< 16142 1727204157.69872: stdout chunk (state=3): >>><<< 16142 1727204157.70180: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204157.6626287-20399-21820426483677=/root/.ansible/tmp/ansible-tmp-1727204157.6626287-20399-21820426483677 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204157.70183: variable 'ansible_module_compression' from source: unknown 16142 1727204157.70186: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 16142 1727204157.70188: variable 'ansible_facts' from source: unknown 16142 1727204157.70189: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204157.6626287-20399-21820426483677/AnsiballZ_ping.py 16142 1727204157.70246: Sending initial data 16142 1727204157.70249: Sent initial data (152 bytes) 16142 1727204157.71253: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204157.71380: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204157.71393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204157.71409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204157.71446: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204157.71459: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204157.71476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204157.71495: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204157.71508: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 
1727204157.71519: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204157.71533: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204157.71548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204157.71566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204157.71582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204157.71594: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204157.71609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204157.71788: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204157.71805: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204157.71820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204157.72038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204157.73868: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204157.73905: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204157.73948: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpr1aejypl /root/.ansible/tmp/ansible-tmp-1727204157.6626287-20399-21820426483677/AnsiballZ_ping.py <<< 16142 1727204157.73997: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204157.75372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204157.75470: stderr chunk (state=3): >>><<< 16142 1727204157.75474: stdout chunk (state=3): >>><<< 16142 1727204157.75476: done transferring module to remote 16142 1727204157.75478: _low_level_execute_command(): starting 16142 1727204157.75480: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204157.6626287-20399-21820426483677/ /root/.ansible/tmp/ansible-tmp-1727204157.6626287-20399-21820426483677/AnsiballZ_ping.py && sleep 0' 16142 1727204157.76756: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204157.76760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204157.76803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204157.76806: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204157.76810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204157.76812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204157.76878: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204157.76882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204157.76938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204157.78782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204157.78853: stderr chunk (state=3): >>><<< 16142 1727204157.78856: stdout chunk (state=3): >>><<< 16142 1727204157.78954: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204157.78958: _low_level_execute_command(): starting 16142 1727204157.78960: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204157.6626287-20399-21820426483677/AnsiballZ_ping.py && sleep 0' 16142 1727204157.80595: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204157.80599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204157.80634: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204157.80637: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204157.80640: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204157.80801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204157.80807: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204157.80809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204157.80880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204157.94380: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 16142 1727204157.95585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 16142 1727204157.95589: stdout chunk (state=3): >>><<< 16142 1727204157.95592: stderr chunk (state=3): >>><<< 16142 1727204157.95608: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
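The "Re-test connectivity" step above is the role's ping check, and the log shows the full remote-execution lifecycle for it: create a remote temp directory, sftp the AnsiballZ_ping.py payload, chmod it, run it with the target's /usr/bin/python3.9 (returning {"ping": "pong"}), and afterwards remove the temp directory (the rm appears just below). A minimal sketch of the task, inferred from the module name and result; the definition at roles/network/tasks/main.yml:192 is not reproduced here:

    - name: Re-test connectivity
      ping:

Roughly the same exchange can be reproduced ad hoc with 'ansible managed-node2 -m ping -vvv' against the same inventory.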
16142 1727204157.95635: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204157.6626287-20399-21820426483677/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204157.95648: _low_level_execute_command(): starting 16142 1727204157.95653: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204157.6626287-20399-21820426483677/ > /dev/null 2>&1 && sleep 0' 16142 1727204157.97088: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204157.97343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204157.97360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204157.97382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204157.97428: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204157.97444: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204157.97459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204157.97481: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204157.97498: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204157.97510: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204157.97522: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204157.97538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204157.97560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204157.97577: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204157.97590: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204157.97604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204157.97796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204157.97814: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204157.97830: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204157.97983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204157.99786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204157.99888: stderr chunk (state=3): >>><<< 16142 1727204157.99891: stdout chunk (state=3): >>><<< 16142 1727204157.99972: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204157.99976: handler run complete 16142 1727204157.99978: attempt loop complete, returning result 16142 1727204157.99980: _execute() done 16142 1727204157.99982: dumping result to json 16142 1727204157.99984: done dumping result, returning 16142 1727204157.99986: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-fddd-f6c7-000000000181] 16142 1727204157.99988: sending task result for task 0affcd87-79f5-fddd-f6c7-000000000181 16142 1727204158.00340: done sending task result for task 0affcd87-79f5-fddd-f6c7-000000000181 16142 1727204158.00343: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 16142 1727204158.00438: no more pending results, returning what we have 16142 1727204158.00442: results queue empty 16142 1727204158.00443: checking for any_errors_fatal 16142 1727204158.00449: done checking for any_errors_fatal 16142 1727204158.00450: checking for max_fail_percentage 16142 1727204158.00452: done checking for max_fail_percentage 16142 1727204158.00453: checking to see if all hosts have failed and the running result is not ok 16142 1727204158.00454: done checking to see if all hosts have failed 16142 1727204158.00455: getting the remaining hosts for this loop 16142 1727204158.00456: done getting the remaining hosts for this loop 16142 1727204158.00459: getting the next task for host managed-node2 16142 1727204158.00472: done getting next task for host managed-node2 16142 1727204158.00475: ^ task is: TASK: meta (role_complete) 16142 1727204158.00479: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 16142 1727204158.00493: getting variables 16142 1727204158.00495: in VariableManager get_vars() 16142 1727204158.00546: Calling all_inventory to load vars for managed-node2 16142 1727204158.00549: Calling groups_inventory to load vars for managed-node2 16142 1727204158.00552: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204158.00561: Calling all_plugins_play to load vars for managed-node2 16142 1727204158.00566: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204158.00570: Calling groups_plugins_play to load vars for managed-node2 16142 1727204158.03137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204158.07043: done with get_vars() 16142 1727204158.07078: done getting variables 16142 1727204158.07162: done queuing things up, now waiting for results queue to drain 16142 1727204158.07710: results queue empty 16142 1727204158.07712: checking for any_errors_fatal 16142 1727204158.07716: done checking for any_errors_fatal 16142 1727204158.07716: checking for max_fail_percentage 16142 1727204158.07718: done checking for max_fail_percentage 16142 1727204158.07718: checking to see if all hosts have failed and the running result is not ok 16142 1727204158.07719: done checking to see if all hosts have failed 16142 1727204158.07720: getting the remaining hosts for this loop 16142 1727204158.07721: done getting the remaining hosts for this loop 16142 1727204158.07724: getting the next task for host managed-node2 16142 1727204158.07729: done getting next task for host managed-node2 16142 1727204158.07731: ^ task is: TASK: Delete the device '{{ controller_device }}' 16142 1727204158.07734: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 16142 1727204158.07737: getting variables 16142 1727204158.07738: in VariableManager get_vars() 16142 1727204158.07762: Calling all_inventory to load vars for managed-node2 16142 1727204158.07767: Calling groups_inventory to load vars for managed-node2 16142 1727204158.07769: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204158.07775: Calling all_plugins_play to load vars for managed-node2 16142 1727204158.07778: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204158.07781: Calling groups_plugins_play to load vars for managed-node2 16142 1727204158.10040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204158.13253: done with get_vars() 16142 1727204158.13998: done getting variables 16142 1727204158.14045: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 16142 1727204158.14169: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:242 Tuesday 24 September 2024 14:55:58 -0400 (0:00:00.531) 0:00:57.318 ***** 16142 1727204158.14200: entering _queue_task() for managed-node2/command 16142 1727204158.14550: worker is 1 (out of 1 available) 16142 1727204158.14562: exiting _queue_task() for managed-node2/command 16142 1727204158.14576: done queuing things up, now waiting for results queue to drain 16142 1727204158.14577: waiting for pending results... 
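The banner above shows the run entering TASK [Delete the device 'nm-bond'] at tests_bond_removal.yml:242, dispatched through the 'command' action plugin with controller_device resolved from play vars. As a point of reference, a minimal sketch of what such a cleanup task typically looks like, assuming the variable name seen in the log; the exact command and the failed_when outcome appear in the module result further down, and the real task body at that path may differ in detail:

    - name: Delete the device '{{ controller_device }}'
      ansible.builtin.command: ip link del {{ controller_device }}
      # The device may already have been removed by the role, so a non-zero
      # return code is tolerated: the result below reports
      # failed_when_result: false even though rc=1.
      failed_when: false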
16142 1727204158.15597: running TaskExecutor() for managed-node2/TASK: Delete the device 'nm-bond' 16142 1727204158.15832: in run() - task 0affcd87-79f5-fddd-f6c7-0000000001b1 16142 1727204158.15937: variable 'ansible_search_path' from source: unknown 16142 1727204158.15983: calling self._execute() 16142 1727204158.16203: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204158.16214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204158.16273: variable 'omit' from source: magic vars 16142 1727204158.16957: variable 'ansible_distribution_major_version' from source: facts 16142 1727204158.17138: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204158.17151: variable 'omit' from source: magic vars 16142 1727204158.17182: variable 'omit' from source: magic vars 16142 1727204158.17288: variable 'controller_device' from source: play vars 16142 1727204158.17361: variable 'omit' from source: magic vars 16142 1727204158.17494: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204158.17531: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204158.17583: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204158.17687: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204158.17701: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204158.17733: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204158.17776: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204158.17785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204158.17990: Set connection var ansible_timeout to 10 16142 1727204158.17999: Set connection var ansible_connection to ssh 16142 1727204158.18007: Set connection var ansible_shell_type to sh 16142 1727204158.18016: Set connection var ansible_shell_executable to /bin/sh 16142 1727204158.18024: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204158.18033: Set connection var ansible_pipelining to False 16142 1727204158.18056: variable 'ansible_shell_executable' from source: unknown 16142 1727204158.18101: variable 'ansible_connection' from source: unknown 16142 1727204158.18108: variable 'ansible_module_compression' from source: unknown 16142 1727204158.18115: variable 'ansible_shell_type' from source: unknown 16142 1727204158.18121: variable 'ansible_shell_executable' from source: unknown 16142 1727204158.18127: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204158.18211: variable 'ansible_pipelining' from source: unknown 16142 1727204158.18218: variable 'ansible_timeout' from source: unknown 16142 1727204158.18225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204158.18477: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204158.18493: variable 'omit' from source: magic vars 16142 
1727204158.18501: starting attempt loop 16142 1727204158.18508: running the handler 16142 1727204158.18640: _low_level_execute_command(): starting 16142 1727204158.18654: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204158.19732: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204158.19737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204158.19775: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204158.19778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204158.19781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204158.19783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204158.19849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204158.19853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204158.19855: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204158.19921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204158.21580: stdout chunk (state=3): >>>/root <<< 16142 1727204158.21678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204158.21760: stderr chunk (state=3): >>><<< 16142 1727204158.21767: stdout chunk (state=3): >>><<< 16142 1727204158.21882: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204158.21886: _low_level_execute_command(): starting 16142 1727204158.21889: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204158.217892-20452-84650480020400 `" && echo ansible-tmp-1727204158.217892-20452-84650480020400="` echo /root/.ansible/tmp/ansible-tmp-1727204158.217892-20452-84650480020400 `" ) && sleep 0' 16142 1727204158.22473: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204158.22487: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204158.22501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204158.22519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204158.22571: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204158.22595: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204158.22610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204158.22630: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204158.22648: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204158.22660: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204158.22676: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204158.22691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204158.22706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204158.22718: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204158.22729: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204158.22744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204158.22825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204158.22842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204158.22860: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204158.22934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204158.24794: stdout chunk (state=3): >>>ansible-tmp-1727204158.217892-20452-84650480020400=/root/.ansible/tmp/ansible-tmp-1727204158.217892-20452-84650480020400 <<< 16142 1727204158.24919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204158.25696: stderr chunk (state=3): >>><<< 16142 1727204158.25702: stdout chunk (state=3): >>><<< 16142 1727204158.25772: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204158.217892-20452-84650480020400=/root/.ansible/tmp/ansible-tmp-1727204158.217892-20452-84650480020400 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204158.25776: variable 'ansible_module_compression' from source: unknown 16142 1727204158.25871: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 16142 1727204158.25981: variable 'ansible_facts' from source: unknown 16142 1727204158.25984: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204158.217892-20452-84650480020400/AnsiballZ_command.py 16142 1727204158.26909: Sending initial data 16142 1727204158.26912: Sent initial data (154 bytes) 16142 1727204158.28708: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204158.28724: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204158.28738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204158.28757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204158.28812: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204158.28825: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204158.28839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204158.28858: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204158.28874: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204158.28887: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204158.28903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204158.28920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204158.28936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204158.28950: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204158.28963: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204158.28980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204158.29060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204158.29080: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204158.29095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204158.29171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204158.30982: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204158.31016: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204158.31058: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmp2uwup7v7 /root/.ansible/tmp/ansible-tmp-1727204158.217892-20452-84650480020400/AnsiballZ_command.py <<< 16142 1727204158.31096: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204158.32578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204158.32672: stderr chunk (state=3): >>><<< 16142 1727204158.32676: stdout chunk (state=3): >>><<< 16142 1727204158.32678: done transferring module to remote 16142 1727204158.32680: _low_level_execute_command(): starting 16142 1727204158.32683: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204158.217892-20452-84650480020400/ /root/.ansible/tmp/ansible-tmp-1727204158.217892-20452-84650480020400/AnsiballZ_command.py && sleep 0' 16142 1727204158.33330: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204158.33347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204158.33366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204158.33391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204158.33436: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204158.33451: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204158.33467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204158.33485: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204158.33496: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204158.33506: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204158.33516: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204158.33527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204158.33541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204158.33556: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204158.33568: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204158.33581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204158.33655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204158.33677: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204158.33692: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204158.33765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204158.35544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204158.35626: stderr chunk (state=3): >>><<< 16142 1727204158.35629: stdout chunk (state=3): >>><<< 16142 1727204158.35733: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204158.35738: _low_level_execute_command(): starting 16142 1727204158.35741: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204158.217892-20452-84650480020400/AnsiballZ_command.py && sleep 0' 16142 1727204158.36357: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204158.36373: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204158.36394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204158.36411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204158.36455: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204158.36468: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204158.36481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204158.36505: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204158.36516: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204158.36526: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204158.36536: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204158.36549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204158.36563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204158.36579: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204158.36590: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204158.36606: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204158.36686: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204158.36704: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204158.36723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204158.36802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204158.50757: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-24 14:55:58.499112", "end": "2024-09-24 14:55:58.506747", "delta": "0:00:00.007635", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 16142 1727204158.51887: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.13.78 closed. <<< 16142 1727204158.51948: stderr chunk (state=3): >>><<< 16142 1727204158.51951: stdout chunk (state=3): >>><<< 16142 1727204158.52074: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-24 14:55:58.499112", "end": "2024-09-24 14:55:58.506747", "delta": "0:00:00.007635", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.13.78 closed. 
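The exchange above, from the "Set connection var" lines through the sftp put of AnsiballZ_command.py and its execution with /usr/bin/python3.9, is the standard non-pipelined module flow: Ansible creates a remote temp directory under remote_tmp, transfers the zipped module, marks it executable, runs it, and removes the directory afterwards. A sketch of the connection variables that correspond to this behaviour, using standard Ansible variable names with the values reported in the log (illustrative only; the actual inventory and defaults are not reproduced here):

    ansible_connection: ssh
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_timeout: 10
    ansible_pipelining: false           # false: the module is copied to remote_tmp and run from there
    ansible_remote_tmp: ~/.ansible/tmp  # parent of the ansible-tmp-... directories seen above
    ansible_python_interpreter: /usr/bin/python3.9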
16142 1727204158.52086: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204158.217892-20452-84650480020400/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204158.52089: _low_level_execute_command(): starting 16142 1727204158.52092: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204158.217892-20452-84650480020400/ > /dev/null 2>&1 && sleep 0' 16142 1727204158.52602: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204158.52605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204158.52650: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204158.52655: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204158.52657: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204158.52723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204158.52740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204158.52804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204158.54688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204158.54718: stderr chunk (state=3): >>><<< 16142 1727204158.54722: stdout chunk (state=3): >>><<< 16142 1727204158.54744: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204158.54751: handler run complete 16142 1727204158.54782: Evaluated conditional (False): False 16142 1727204158.54786: Evaluated conditional (False): False 16142 1727204158.54799: attempt loop complete, returning result 16142 1727204158.54802: _execute() done 16142 1727204158.54805: dumping result to json 16142 1727204158.54810: done dumping result, returning 16142 1727204158.54832: done running TaskExecutor() for managed-node2/TASK: Delete the device 'nm-bond' [0affcd87-79f5-fddd-f6c7-0000000001b1] 16142 1727204158.54835: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001b1 16142 1727204158.54976: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001b1 16142 1727204158.54979: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.007635", "end": "2024-09-24 14:55:58.506747", "failed_when_result": false, "rc": 1, "start": "2024-09-24 14:55:58.499112" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 16142 1727204158.55045: no more pending results, returning what we have 16142 1727204158.55048: results queue empty 16142 1727204158.55049: checking for any_errors_fatal 16142 1727204158.55051: done checking for any_errors_fatal 16142 1727204158.55052: checking for max_fail_percentage 16142 1727204158.55053: done checking for max_fail_percentage 16142 1727204158.55054: checking to see if all hosts have failed and the running result is not ok 16142 1727204158.55055: done checking to see if all hosts have failed 16142 1727204158.55056: getting the remaining hosts for this loop 16142 1727204158.55057: done getting the remaining hosts for this loop 16142 1727204158.55061: getting the next task for host managed-node2 16142 1727204158.55071: done getting next task for host managed-node2 16142 1727204158.55074: ^ task is: TASK: Remove test interfaces 16142 1727204158.55078: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 16142 1727204158.55082: getting variables 16142 1727204158.55084: in VariableManager get_vars() 16142 1727204158.55132: Calling all_inventory to load vars for managed-node2 16142 1727204158.55135: Calling groups_inventory to load vars for managed-node2 16142 1727204158.55137: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204158.55147: Calling all_plugins_play to load vars for managed-node2 16142 1727204158.55149: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204158.55152: Calling groups_plugins_play to load vars for managed-node2 16142 1727204158.70639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204158.74168: done with get_vars() 16142 1727204158.74206: done getting variables 16142 1727204158.74261: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:55:58 -0400 (0:00:00.600) 0:00:57.919 ***** 16142 1727204158.74299: entering _queue_task() for managed-node2/shell 16142 1727204158.75062: worker is 1 (out of 1 available) 16142 1727204158.75280: exiting _queue_task() for managed-node2/shell 16142 1727204158.75291: done queuing things up, now waiting for results queue to drain 16142 1727204158.75293: waiting for pending results... 
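The run now queues TASK [Remove test interfaces] from remove_test_interfaces_with_dhcp.yml:3, a 'shell' action that reads dhcp_interface1 and dhcp_interface2 from play vars. A sketch of the task, reconstructed from the _raw_params that appear verbatim in the module result further down; the real task presumably templates the interface names from those play vars rather than hard-coding them:

    - name: Remove test interfaces
      ansible.builtin.shell: |
        set -euxo pipefail
        exec 1>&2
        rc=0
        ip link delete test1 || rc="$?"
        if [ "$rc" != 0 ]; then
          echo ERROR - could not delete link test1 - error "$rc"
        fi
        ip link delete test2 || rc="$?"
        if [ "$rc" != 0 ]; then
          echo ERROR - could not delete link test2 - error "$rc"
        fi
        ip link delete testbr || rc="$?"
        if [ "$rc" != 0 ]; then
          echo ERROR - could not delete link testbr - error "$rc"
        fi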
16142 1727204158.75978: running TaskExecutor() for managed-node2/TASK: Remove test interfaces 16142 1727204158.76376: in run() - task 0affcd87-79f5-fddd-f6c7-0000000001b5 16142 1727204158.76444: variable 'ansible_search_path' from source: unknown 16142 1727204158.76541: variable 'ansible_search_path' from source: unknown 16142 1727204158.76585: calling self._execute() 16142 1727204158.76810: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204158.76822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204158.76834: variable 'omit' from source: magic vars 16142 1727204158.77561: variable 'ansible_distribution_major_version' from source: facts 16142 1727204158.77741: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204158.77755: variable 'omit' from source: magic vars 16142 1727204158.77830: variable 'omit' from source: magic vars 16142 1727204158.78212: variable 'dhcp_interface1' from source: play vars 16142 1727204158.78224: variable 'dhcp_interface2' from source: play vars 16142 1727204158.78296: variable 'omit' from source: magic vars 16142 1727204158.78344: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204158.78527: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204158.78556: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204158.78581: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204158.78607: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204158.78645: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204158.78717: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204158.78726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204158.79303: Set connection var ansible_timeout to 10 16142 1727204158.79311: Set connection var ansible_connection to ssh 16142 1727204158.79322: Set connection var ansible_shell_type to sh 16142 1727204158.79331: Set connection var ansible_shell_executable to /bin/sh 16142 1727204158.79340: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204158.79352: Set connection var ansible_pipelining to False 16142 1727204158.79390: variable 'ansible_shell_executable' from source: unknown 16142 1727204158.79498: variable 'ansible_connection' from source: unknown 16142 1727204158.79506: variable 'ansible_module_compression' from source: unknown 16142 1727204158.79513: variable 'ansible_shell_type' from source: unknown 16142 1727204158.79519: variable 'ansible_shell_executable' from source: unknown 16142 1727204158.79526: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204158.79534: variable 'ansible_pipelining' from source: unknown 16142 1727204158.79542: variable 'ansible_timeout' from source: unknown 16142 1727204158.79550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204158.79855: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204158.79875: variable 'omit' from source: magic vars 16142 1727204158.79886: starting attempt loop 16142 1727204158.79892: running the handler 16142 1727204158.79906: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204158.79936: _low_level_execute_command(): starting 16142 1727204158.80044: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204158.82001: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204158.82007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204158.82040: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204158.82043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204158.82047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204158.82049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204158.82208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204158.82221: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204158.82289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204158.83949: stdout chunk (state=3): >>>/root <<< 16142 1727204158.84049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204158.84145: stderr chunk (state=3): >>><<< 16142 1727204158.84149: stdout chunk (state=3): >>><<< 16142 1727204158.84276: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 
originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204158.84280: _low_level_execute_command(): starting 16142 1727204158.84283: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204158.841752-20528-234377907625189 `" && echo ansible-tmp-1727204158.841752-20528-234377907625189="` echo /root/.ansible/tmp/ansible-tmp-1727204158.841752-20528-234377907625189 `" ) && sleep 0' 16142 1727204158.85609: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204158.85728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204158.85732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204158.85776: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204158.85781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204158.85784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204158.85786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204158.85954: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204158.85986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204158.85989: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204158.86053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204158.87910: stdout chunk (state=3): >>>ansible-tmp-1727204158.841752-20528-234377907625189=/root/.ansible/tmp/ansible-tmp-1727204158.841752-20528-234377907625189 <<< 16142 1727204158.88027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204158.88118: stderr chunk (state=3): >>><<< 16142 1727204158.88122: stdout chunk (state=3): >>><<< 16142 1727204158.88374: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204158.841752-20528-234377907625189=/root/.ansible/tmp/ansible-tmp-1727204158.841752-20528-234377907625189 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204158.88377: variable 'ansible_module_compression' from source: unknown 16142 1727204158.88380: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 16142 1727204158.88383: variable 'ansible_facts' from source: unknown 16142 1727204158.88384: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204158.841752-20528-234377907625189/AnsiballZ_command.py 16142 1727204158.89024: Sending initial data 16142 1727204158.89027: Sent initial data (155 bytes) 16142 1727204158.92034: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204158.92057: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204158.92078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204158.92116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204158.92158: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204158.92212: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204158.92227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204158.92372: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204158.92384: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204158.92394: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204158.92405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204158.92452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204158.93132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204158.93146: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204158.93158: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204158.93174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204158.93403: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204158.93420: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204158.93437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204158.93677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204158.95334: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: 
Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204158.95373: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204158.95414: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmp4qctqxvi /root/.ansible/tmp/ansible-tmp-1727204158.841752-20528-234377907625189/AnsiballZ_command.py <<< 16142 1727204158.95438: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204158.96725: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204158.96810: stderr chunk (state=3): >>><<< 16142 1727204158.96814: stdout chunk (state=3): >>><<< 16142 1727204158.96839: done transferring module to remote 16142 1727204158.96852: _low_level_execute_command(): starting 16142 1727204158.96857: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204158.841752-20528-234377907625189/ /root/.ansible/tmp/ansible-tmp-1727204158.841752-20528-234377907625189/AnsiballZ_command.py && sleep 0' 16142 1727204158.98963: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204158.98976: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204158.98988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204158.99002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204158.99046: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204158.99052: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204158.99062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204158.99078: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204158.99086: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204158.99092: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204158.99100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204158.99109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204158.99121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204158.99127: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204158.99134: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204158.99144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204158.99219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204158.99236: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 
1727204158.99252: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204158.99324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204159.01189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204159.01193: stdout chunk (state=3): >>><<< 16142 1727204159.01197: stderr chunk (state=3): >>><<< 16142 1727204159.01219: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204159.01223: _low_level_execute_command(): starting 16142 1727204159.01225: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204158.841752-20528-234377907625189/AnsiballZ_command.py && sleep 0' 16142 1727204159.02566: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204159.03284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204159.03293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204159.03307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204159.03351: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204159.03358: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204159.03371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204159.03383: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204159.03391: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204159.03398: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204159.03404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204159.03413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204159.03427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204159.03430: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204159.03435: stderr chunk (state=3): >>>debug2: match found <<< 16142 
1727204159.03449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204159.03522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204159.03545: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204159.03558: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204159.03636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204159.22769: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:55:59.168121", "end": "2024-09-24 14:55:59.226396", "delta": "0:00:00.058275", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 16142 1727204159.24541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204159.24546: stdout chunk (state=3): >>><<< 16142 1727204159.24551: stderr chunk (state=3): >>><<< 16142 1727204159.24572: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:55:59.168121", "end": "2024-09-24 14:55:59.226396", "delta": "0:00:00.058275", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
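Editor's note: for readability, the shell payload that the command module executed above (shown with JSON escaping inside the module result) decodes to the script below. This is only the logged _raw_params re-wrapped onto separate lines with annotations added; it is not a quote from the test playbook file itself.

    # decoded from the module_args shown above; comments are annotations
    set -euxo pipefail     # strict mode + xtrace (produces the '+ ...' lines in STDERR)
    exec 1>&2              # redirect all stdout to stderr
    rc=0
    ip link delete test1 || rc="$?"
    if [ "$rc" != 0 ]; then
     echo ERROR - could not delete link test1 - error "$rc"
    fi
    ip link delete test2 || rc="$?"
    if [ "$rc" != 0 ]; then
     echo ERROR - could not delete link test2 - error "$rc"
    fi
    ip link delete testbr || rc="$?"
    if [ "$rc" != 0 ]; then
     echo ERROR - could not delete link testbr - error "$rc"
    fi

Because every ip link delete is guarded with || rc="$?", a missing interface would not abort the script despite set -e; the xtrace captured in the task's STDERR (+ ip link delete test1 ... + '[' 0 '!=' 0 ']') shows all three test links were removed cleanly with rc=0.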
16142 1727204159.24614: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204158.841752-20528-234377907625189/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204159.24622: _low_level_execute_command(): starting 16142 1727204159.24628: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204158.841752-20528-234377907625189/ > /dev/null 2>&1 && sleep 0' 16142 1727204159.25979: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204159.25983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204159.26033: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204159.26037: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204159.26050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204159.26054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204159.26144: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204159.26166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204159.26234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204159.28090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204159.28094: stderr chunk (state=3): >>><<< 16142 1727204159.28099: stdout chunk (state=3): >>><<< 16142 1727204159.28118: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204159.28125: handler run complete 16142 1727204159.28155: Evaluated conditional (False): False 16142 1727204159.28165: attempt loop complete, returning result 16142 1727204159.28168: _execute() done 16142 1727204159.28171: dumping result to json 16142 1727204159.28178: done dumping result, returning 16142 1727204159.28191: done running TaskExecutor() for managed-node2/TASK: Remove test interfaces [0affcd87-79f5-fddd-f6c7-0000000001b5] 16142 1727204159.28193: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001b5 16142 1727204159.28301: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001b5 16142 1727204159.28304: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.058275", "end": "2024-09-24 14:55:59.226396", "rc": 0, "start": "2024-09-24 14:55:59.168121" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 16142 1727204159.28397: no more pending results, returning what we have 16142 1727204159.28401: results queue empty 16142 1727204159.28402: checking for any_errors_fatal 16142 1727204159.28416: done checking for any_errors_fatal 16142 1727204159.28416: checking for max_fail_percentage 16142 1727204159.28418: done checking for max_fail_percentage 16142 1727204159.28419: checking to see if all hosts have failed and the running result is not ok 16142 1727204159.28420: done checking to see if all hosts have failed 16142 1727204159.28421: getting the remaining hosts for this loop 16142 1727204159.28422: done getting the remaining hosts for this loop 16142 1727204159.28426: getting the next task for host managed-node2 16142 1727204159.28431: done getting next task for host managed-node2 16142 1727204159.28434: ^ task is: TASK: Stop dnsmasq/radvd services 16142 1727204159.28437: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 16142 1727204159.28442: getting variables 16142 1727204159.28444: in VariableManager get_vars() 16142 1727204159.28495: Calling all_inventory to load vars for managed-node2 16142 1727204159.28498: Calling groups_inventory to load vars for managed-node2 16142 1727204159.28500: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204159.28509: Calling all_plugins_play to load vars for managed-node2 16142 1727204159.28511: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204159.28518: Calling groups_plugins_play to load vars for managed-node2 16142 1727204159.32295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204159.36044: done with get_vars() 16142 1727204159.36185: done getting variables 16142 1727204159.36247: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Tuesday 24 September 2024 14:55:59 -0400 (0:00:00.619) 0:00:58.539 ***** 16142 1727204159.36399: entering _queue_task() for managed-node2/shell 16142 1727204159.37039: worker is 1 (out of 1 available) 16142 1727204159.37166: exiting _queue_task() for managed-node2/shell 16142 1727204159.37178: done queuing things up, now waiting for results queue to drain 16142 1727204159.37179: waiting for pending results... 
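Editor's note: the entries that follow trace the same remote-execution pattern that every task in this log goes through. Condensed into the literal commands Ansible issues over the multiplexed SSH connection, with TMP standing in for the generated /root/.ansible/tmp/ansible-tmp-<timestamp>-<counter>-<random> directory (the log prints the full path), the sequence is roughly:

    # find the remote user's home directory
    /bin/sh -c 'echo ~ && sleep 0'
    # create a private per-task temp directory (umask 77 keeps it owner-only)
    /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `" && mkdir "` echo TMP `" && echo TMP="` echo TMP `" ) && sleep 0'
    # AnsiballZ_command.py is then uploaded into TMP over sftp (see the 'sftp> put' line below)
    # make the wrapper executable, run it with the remote interpreter, then clean up
    /bin/sh -c 'chmod u+x TMP/ TMP/AnsiballZ_command.py && sleep 0'
    /bin/sh -c '/usr/bin/python3.9 TMP/AnsiballZ_command.py && sleep 0'
    /bin/sh -c 'rm -f -r TMP/ > /dev/null 2>&1 && sleep 0'

Each of these steps shows up below as a _low_level_execute_command() starting/executing/done triple, wrapped in the usual OpenSSH debug1/debug2 stderr chunks.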
16142 1727204159.38064: running TaskExecutor() for managed-node2/TASK: Stop dnsmasq/radvd services 16142 1727204159.38526: in run() - task 0affcd87-79f5-fddd-f6c7-0000000001b6 16142 1727204159.38554: variable 'ansible_search_path' from source: unknown 16142 1727204159.38597: variable 'ansible_search_path' from source: unknown 16142 1727204159.38674: calling self._execute() 16142 1727204159.38969: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204159.38982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204159.38999: variable 'omit' from source: magic vars 16142 1727204159.40042: variable 'ansible_distribution_major_version' from source: facts 16142 1727204159.40103: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204159.41970: variable 'omit' from source: magic vars 16142 1727204159.41975: variable 'omit' from source: magic vars 16142 1727204159.41978: variable 'omit' from source: magic vars 16142 1727204159.41981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204159.41984: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204159.41986: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204159.41988: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204159.41990: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204159.41993: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204159.41995: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204159.41997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204159.41999: Set connection var ansible_timeout to 10 16142 1727204159.42001: Set connection var ansible_connection to ssh 16142 1727204159.42003: Set connection var ansible_shell_type to sh 16142 1727204159.42006: Set connection var ansible_shell_executable to /bin/sh 16142 1727204159.42007: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204159.42009: Set connection var ansible_pipelining to False 16142 1727204159.42011: variable 'ansible_shell_executable' from source: unknown 16142 1727204159.42013: variable 'ansible_connection' from source: unknown 16142 1727204159.42016: variable 'ansible_module_compression' from source: unknown 16142 1727204159.42018: variable 'ansible_shell_type' from source: unknown 16142 1727204159.42020: variable 'ansible_shell_executable' from source: unknown 16142 1727204159.42022: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204159.42024: variable 'ansible_pipelining' from source: unknown 16142 1727204159.42026: variable 'ansible_timeout' from source: unknown 16142 1727204159.42027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204159.42030: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204159.42041: variable 'omit' from source: magic vars 16142 
1727204159.42043: starting attempt loop 16142 1727204159.42046: running the handler 16142 1727204159.42148: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204159.42276: _low_level_execute_command(): starting 16142 1727204159.42283: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204159.44358: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204159.44373: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204159.44385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204159.44488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204159.44526: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204159.44533: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204159.44543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204159.44556: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204159.44566: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204159.44575: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204159.44583: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204159.44593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204159.44607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204159.44614: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204159.44622: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204159.44631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204159.44905: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204159.44921: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204159.44929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204159.45113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204159.46681: stdout chunk (state=3): >>>/root <<< 16142 1727204159.46854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204159.46857: stdout chunk (state=3): >>><<< 16142 1727204159.46870: stderr chunk (state=3): >>><<< 16142 1727204159.46893: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204159.46906: _low_level_execute_command(): starting 16142 1727204159.46917: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204159.4689379-20544-177797263687632 `" && echo ansible-tmp-1727204159.4689379-20544-177797263687632="` echo /root/.ansible/tmp/ansible-tmp-1727204159.4689379-20544-177797263687632 `" ) && sleep 0' 16142 1727204159.47993: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204159.48083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204159.48092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204159.48104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204159.48156: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204159.48242: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204159.48251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204159.48266: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204159.48276: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204159.48283: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204159.48291: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204159.48299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204159.48312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204159.48319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204159.48326: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204159.48335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204159.48411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204159.48469: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204159.48473: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204159.48730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204159.50592: stdout chunk (state=3): >>>ansible-tmp-1727204159.4689379-20544-177797263687632=/root/.ansible/tmp/ansible-tmp-1727204159.4689379-20544-177797263687632 <<< 16142 1727204159.50777: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204159.50781: stdout chunk (state=3): >>><<< 16142 1727204159.50788: stderr chunk (state=3): >>><<< 16142 1727204159.50819: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204159.4689379-20544-177797263687632=/root/.ansible/tmp/ansible-tmp-1727204159.4689379-20544-177797263687632 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204159.50853: variable 'ansible_module_compression' from source: unknown 16142 1727204159.50913: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 16142 1727204159.50953: variable 'ansible_facts' from source: unknown 16142 1727204159.51021: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204159.4689379-20544-177797263687632/AnsiballZ_command.py 16142 1727204159.51607: Sending initial data 16142 1727204159.51611: Sent initial data (156 bytes) 16142 1727204159.54410: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204159.54514: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204159.54534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204159.54550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204159.54596: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204159.54603: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204159.54625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204159.54636: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204159.54646: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204159.54651: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204159.54674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204159.54678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204159.54700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204159.54703: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204159.54719: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204159.54722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204159.54908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204159.54942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204159.54945: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204159.55168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204159.56826: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204159.56866: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204159.56911: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmp7g8unxet /root/.ansible/tmp/ansible-tmp-1727204159.4689379-20544-177797263687632/AnsiballZ_command.py <<< 16142 1727204159.56946: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204159.58382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204159.58504: stderr chunk (state=3): >>><<< 16142 1727204159.58523: stdout chunk (state=3): >>><<< 16142 1727204159.58561: done transferring module to remote 16142 1727204159.58567: _low_level_execute_command(): starting 16142 1727204159.58571: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204159.4689379-20544-177797263687632/ /root/.ansible/tmp/ansible-tmp-1727204159.4689379-20544-177797263687632/AnsiballZ_command.py && sleep 0' 16142 1727204159.60516: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204159.60524: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204159.60544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204159.60557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204159.60708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204159.60711: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204159.60734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204159.60737: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204159.60745: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204159.60759: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204159.60768: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204159.60789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204159.60798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204159.60806: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204159.60813: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204159.60824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204159.60938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204159.61107: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204159.61111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204159.61271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204159.62985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204159.63041: stderr chunk (state=3): >>><<< 16142 1727204159.63046: stdout chunk (state=3): >>><<< 16142 1727204159.63070: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204159.63073: _low_level_execute_command(): starting 16142 1727204159.63079: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204159.4689379-20544-177797263687632/AnsiballZ_command.py && sleep 0' 16142 1727204159.65290: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204159.65311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204159.65330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204159.65333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204159.65381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204159.65460: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204159.65480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 
1727204159.65489: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204159.65494: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204159.65501: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204159.65509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204159.65580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204159.65584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204159.65587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204159.65589: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204159.65591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204159.65659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204159.65787: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204159.65804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204159.66022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204159.81234: stdout chunk (state=3): >>> <<< 16142 1727204159.81284: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:55:59.790542", "end": "2024-09-24 14:55:59.811433", "delta": "0:00:00.020891", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 16142 1727204159.82575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204159.82579: stdout chunk (state=3): >>><<< 16142 1727204159.82588: stderr chunk (state=3): >>><<< 16142 1727204159.82615: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:55:59.790542", "end": "2024-09-24 14:55:59.811433", "delta": "0:00:00.020891", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
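Editor's note: decoded from the module result above, the "Stop dnsmasq/radvd services" payload is the following script (again just the logged _raw_params re-wrapped, with annotations added):

    set -uxo pipefail
    exec 1>&2
    pkill -F /run/dhcp_testbr.pid          # kill the process recorded in the test pid file
    rm -rf /run/dhcp_testbr.pid
    rm -rf /run/dhcp_testbr.lease
    if grep 'release 6' /etc/redhat-release; then
     # Stop radvd server
     service radvd stop
     iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT
    fi
    if systemctl is-active firewalld; then
     for service in dhcp dhcpv6 dhcpv6-client; do
      if firewall-cmd --query-service="$service"; then
       firewall-cmd --remove-service "$service"
      fi
     done
    fi

The xtrace in the task's STDERR (grep finding no 'release 6' match, and systemctl is-active firewalld reporting "inactive") shows that both conditional branches were skipped on this node, so only the pkill and the two rm -rf calls did any work.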
16142 1727204159.82655: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204159.4689379-20544-177797263687632/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204159.82663: _low_level_execute_command(): starting 16142 1727204159.82671: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204159.4689379-20544-177797263687632/ > /dev/null 2>&1 && sleep 0' 16142 1727204159.84135: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204159.84285: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204159.84295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204159.84310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204159.84396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204159.84402: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204159.84412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204159.84498: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204159.84509: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204159.84518: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204159.84523: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204159.84535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204159.84548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204159.84555: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204159.84562: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204159.84575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204159.84653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204159.84715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204159.84719: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204159.84842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204159.86682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204159.86686: stdout chunk (state=3): >>><<< 16142 1727204159.86692: stderr chunk (state=3): >>><<< 16142 1727204159.86708: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204159.86715: handler run complete 16142 1727204159.86741: Evaluated conditional (False): False 16142 1727204159.86747: attempt loop complete, returning result 16142 1727204159.86750: _execute() done 16142 1727204159.86753: dumping result to json 16142 1727204159.86758: done dumping result, returning 16142 1727204159.86769: done running TaskExecutor() for managed-node2/TASK: Stop dnsmasq/radvd services [0affcd87-79f5-fddd-f6c7-0000000001b6] 16142 1727204159.86775: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001b6 16142 1727204159.86888: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001b6 16142 1727204159.86891: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.020891", "end": "2024-09-24 14:55:59.811433", "rc": 0, "start": "2024-09-24 14:55:59.790542" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 16142 1727204159.86986: no more pending results, returning what we have 16142 1727204159.86990: results queue empty 16142 1727204159.86991: checking for any_errors_fatal 16142 1727204159.87000: done checking for any_errors_fatal 16142 1727204159.87001: checking for max_fail_percentage 16142 1727204159.87003: done checking for max_fail_percentage 16142 1727204159.87004: checking to see if all hosts have failed and 
the running result is not ok 16142 1727204159.87005: done checking to see if all hosts have failed 16142 1727204159.87005: getting the remaining hosts for this loop 16142 1727204159.87007: done getting the remaining hosts for this loop 16142 1727204159.87010: getting the next task for host managed-node2 16142 1727204159.87017: done getting next task for host managed-node2 16142 1727204159.87020: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 16142 1727204159.87023: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 16142 1727204159.87028: getting variables 16142 1727204159.87030: in VariableManager get_vars() 16142 1727204159.87079: Calling all_inventory to load vars for managed-node2 16142 1727204159.87082: Calling groups_inventory to load vars for managed-node2 16142 1727204159.87084: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204159.87094: Calling all_plugins_play to load vars for managed-node2 16142 1727204159.87096: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204159.87099: Calling groups_plugins_play to load vars for managed-node2 16142 1727204159.89537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204159.91431: done with get_vars() 16142 1727204159.91461: done getting variables 16142 1727204159.91538: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:248 Tuesday 24 September 2024 14:55:59 -0400 (0:00:00.552) 0:00:59.092 ***** 16142 1727204159.91574: entering _queue_task() for managed-node2/command 16142 1727204159.91948: worker is 1 (out of 1 available) 16142 1727204159.91961: exiting _queue_task() for managed-node2/command 16142 1727204159.91974: done queuing things up, now waiting for results queue to drain 16142 1727204159.91975: waiting for pending results... 
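Editor's note: one pattern worth calling out across all of these stderr chunks is that every remote command reports "auto-mux: Trying existing master" and "mux_client_request_session: master session id: 2", i.e. each short /bin/sh invocation rides an already-established OpenSSH ControlMaster connection instead of paying for a new TCP handshake and authentication. The exact ssh options used for this run are not shown in the log; purely as an illustration, client options of roughly this shape (similar to the defaults of Ansible's ssh connection plugin) produce that behaviour:

    # illustrative sketch, not the literal command line from this run:
    # reuse one persistent master connection for many sessions to the managed node
    ssh -o ControlMaster=auto \
        -o ControlPersist=60s \
        -o ControlPath="$HOME/.ansible/cp/%h-%p-%r" \
        root@10.31.13.78 /bin/sh -c 'echo ~ && sleep 0'

The "Received exit status from master 0" and "Shared connection to 10.31.13.78 closed." lines that end each chunk appear to be the mux client tearing down its own session while the master connection stays up for the next task.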
16142 1727204159.92298: running TaskExecutor() for managed-node2/TASK: Restore the /etc/resolv.conf for initscript 16142 1727204159.92403: in run() - task 0affcd87-79f5-fddd-f6c7-0000000001b7 16142 1727204159.92420: variable 'ansible_search_path' from source: unknown 16142 1727204159.92463: calling self._execute() 16142 1727204159.92582: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204159.92592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204159.92602: variable 'omit' from source: magic vars 16142 1727204159.93040: variable 'ansible_distribution_major_version' from source: facts 16142 1727204159.93050: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204159.93177: variable 'network_provider' from source: set_fact 16142 1727204159.93188: Evaluated conditional (network_provider == "initscripts"): False 16142 1727204159.93191: when evaluation is False, skipping this task 16142 1727204159.93194: _execute() done 16142 1727204159.93198: dumping result to json 16142 1727204159.93206: done dumping result, returning 16142 1727204159.93213: done running TaskExecutor() for managed-node2/TASK: Restore the /etc/resolv.conf for initscript [0affcd87-79f5-fddd-f6c7-0000000001b7] 16142 1727204159.93220: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001b7 16142 1727204159.93319: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001b7 16142 1727204159.93323: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 16142 1727204159.93375: no more pending results, returning what we have 16142 1727204159.93379: results queue empty 16142 1727204159.93380: checking for any_errors_fatal 16142 1727204159.93392: done checking for any_errors_fatal 16142 1727204159.93393: checking for max_fail_percentage 16142 1727204159.93395: done checking for max_fail_percentage 16142 1727204159.93396: checking to see if all hosts have failed and the running result is not ok 16142 1727204159.93397: done checking to see if all hosts have failed 16142 1727204159.93398: getting the remaining hosts for this loop 16142 1727204159.93399: done getting the remaining hosts for this loop 16142 1727204159.93403: getting the next task for host managed-node2 16142 1727204159.93412: done getting next task for host managed-node2 16142 1727204159.93416: ^ task is: TASK: Verify network state restored to default 16142 1727204159.93420: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 16142 1727204159.93425: getting variables 16142 1727204159.93426: in VariableManager get_vars() 16142 1727204159.93487: Calling all_inventory to load vars for managed-node2 16142 1727204159.93490: Calling groups_inventory to load vars for managed-node2 16142 1727204159.93492: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204159.93505: Calling all_plugins_play to load vars for managed-node2 16142 1727204159.93508: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204159.93511: Calling groups_plugins_play to load vars for managed-node2 16142 1727204159.95223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204159.97001: done with get_vars() 16142 1727204159.97042: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:253 Tuesday 24 September 2024 14:55:59 -0400 (0:00:00.055) 0:00:59.148 ***** 16142 1727204159.97151: entering _queue_task() for managed-node2/include_tasks 16142 1727204159.97517: worker is 1 (out of 1 available) 16142 1727204159.97530: exiting _queue_task() for managed-node2/include_tasks 16142 1727204159.97541: done queuing things up, now waiting for results queue to drain 16142 1727204159.97542: waiting for pending results... 16142 1727204159.97885: running TaskExecutor() for managed-node2/TASK: Verify network state restored to default 16142 1727204159.97997: in run() - task 0affcd87-79f5-fddd-f6c7-0000000001b8 16142 1727204159.98024: variable 'ansible_search_path' from source: unknown 16142 1727204159.98066: calling self._execute() 16142 1727204159.98288: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204159.98292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204159.98302: variable 'omit' from source: magic vars 16142 1727204159.99110: variable 'ansible_distribution_major_version' from source: facts 16142 1727204159.99122: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204159.99129: _execute() done 16142 1727204159.99133: dumping result to json 16142 1727204159.99140: done dumping result, returning 16142 1727204159.99146: done running TaskExecutor() for managed-node2/TASK: Verify network state restored to default [0affcd87-79f5-fddd-f6c7-0000000001b8] 16142 1727204159.99153: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001b8 16142 1727204159.99257: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000001b8 16142 1727204159.99261: WORKER PROCESS EXITING 16142 1727204159.99293: no more pending results, returning what we have 16142 1727204159.99299: in VariableManager get_vars() 16142 1727204159.99375: Calling all_inventory to load vars for managed-node2 16142 1727204159.99378: Calling groups_inventory to load vars for managed-node2 16142 1727204159.99381: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204159.99396: Calling all_plugins_play to load vars for managed-node2 16142 1727204159.99399: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204159.99403: Calling groups_plugins_play to load vars for managed-node2 16142 1727204160.01593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204160.04099: done with get_vars() 16142 
1727204160.04133: variable 'ansible_search_path' from source: unknown 16142 1727204160.04165: we have included files to process 16142 1727204160.04169: generating all_blocks data 16142 1727204160.04173: done generating all_blocks data 16142 1727204160.04179: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 16142 1727204160.04187: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 16142 1727204160.04191: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 16142 1727204160.04704: done processing included file 16142 1727204160.04707: iterating over new_blocks loaded from include file 16142 1727204160.04708: in VariableManager get_vars() 16142 1727204160.04750: done with get_vars() 16142 1727204160.04752: filtering new block on tags 16142 1727204160.04800: done filtering new block on tags 16142 1727204160.04803: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node2 16142 1727204160.04808: extending task lists for all hosts with included blocks 16142 1727204160.07626: done extending task lists 16142 1727204160.07628: done processing included files 16142 1727204160.07762: results queue empty 16142 1727204160.07765: checking for any_errors_fatal 16142 1727204160.07772: done checking for any_errors_fatal 16142 1727204160.07773: checking for max_fail_percentage 16142 1727204160.07775: done checking for max_fail_percentage 16142 1727204160.07776: checking to see if all hosts have failed and the running result is not ok 16142 1727204160.07793: done checking to see if all hosts have failed 16142 1727204160.07794: getting the remaining hosts for this loop 16142 1727204160.07796: done getting the remaining hosts for this loop 16142 1727204160.07799: getting the next task for host managed-node2 16142 1727204160.07804: done getting next task for host managed-node2 16142 1727204160.07807: ^ task is: TASK: Check routes and DNS 16142 1727204160.07810: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 16142 1727204160.07813: getting variables 16142 1727204160.07814: in VariableManager get_vars() 16142 1727204160.07842: Calling all_inventory to load vars for managed-node2 16142 1727204160.07845: Calling groups_inventory to load vars for managed-node2 16142 1727204160.07847: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204160.07854: Calling all_plugins_play to load vars for managed-node2 16142 1727204160.07856: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204160.07859: Calling groups_plugins_play to load vars for managed-node2 16142 1727204160.11519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204160.13923: done with get_vars() 16142 1727204160.13957: done getting variables 16142 1727204160.14016: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:56:00 -0400 (0:00:00.169) 0:00:59.317 ***** 16142 1727204160.14051: entering _queue_task() for managed-node2/shell 16142 1727204160.15808: worker is 1 (out of 1 available) 16142 1727204160.15821: exiting _queue_task() for managed-node2/shell 16142 1727204160.15832: done queuing things up, now waiting for results queue to drain 16142 1727204160.15833: waiting for pending results... 16142 1727204160.16898: running TaskExecutor() for managed-node2/TASK: Check routes and DNS 16142 1727204160.17143: in run() - task 0affcd87-79f5-fddd-f6c7-0000000009f0 16142 1727204160.17170: variable 'ansible_search_path' from source: unknown 16142 1727204160.17279: variable 'ansible_search_path' from source: unknown 16142 1727204160.17324: calling self._execute() 16142 1727204160.17543: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204160.17552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204160.17566: variable 'omit' from source: magic vars 16142 1727204160.18288: variable 'ansible_distribution_major_version' from source: facts 16142 1727204160.18385: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204160.18396: variable 'omit' from source: magic vars 16142 1727204160.18455: variable 'omit' from source: magic vars 16142 1727204160.18613: variable 'omit' from source: magic vars 16142 1727204160.18660: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204160.18811: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204160.18843: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204160.18869: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204160.18921: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204160.18958: variable 'inventory_hostname' from source: host vars for 'managed-node2' 
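[annotation] The two scheduler steps above show the /etc/resolv.conf restore task being skipped (its condition network_provider == "initscripts" is False, since network_provider was set via set_fact to another provider on this run) and "Verify network state restored to default" being dispatched as an include_tasks task (note the "_queue_task() for managed-node2/include_tasks" entry) that pulls in check_network_dns.yml. A hedged sketch of how that include is likely written at tests_bond_removal.yml:253 follows; the task name, the include_tasks action and the included file come from the log, while the relative path and the placement of the distribution guard are assumptions (the guard may be inherited from an enclosing block rather than written on the task).

    # Sketch (assumed layout) of the include at tests_bond_removal.yml:253.
    # Name, action type and included file are taken from the log entries above;
    # the relative path and the when: placement are assumptions.
    - name: Verify network state restored to default
      include_tasks: tasks/check_network_dns.yml
      when: ansible_distribution_major_version != '6'
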
16142 1727204160.19022: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204160.19030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204160.19255: Set connection var ansible_timeout to 10 16142 1727204160.19262: Set connection var ansible_connection to ssh 16142 1727204160.19275: Set connection var ansible_shell_type to sh 16142 1727204160.19352: Set connection var ansible_shell_executable to /bin/sh 16142 1727204160.19365: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204160.19378: Set connection var ansible_pipelining to False 16142 1727204160.19405: variable 'ansible_shell_executable' from source: unknown 16142 1727204160.19413: variable 'ansible_connection' from source: unknown 16142 1727204160.19419: variable 'ansible_module_compression' from source: unknown 16142 1727204160.19425: variable 'ansible_shell_type' from source: unknown 16142 1727204160.19431: variable 'ansible_shell_executable' from source: unknown 16142 1727204160.19455: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204160.19463: variable 'ansible_pipelining' from source: unknown 16142 1727204160.19567: variable 'ansible_timeout' from source: unknown 16142 1727204160.19577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204160.19837: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204160.19853: variable 'omit' from source: magic vars 16142 1727204160.19863: starting attempt loop 16142 1727204160.19873: running the handler 16142 1727204160.19893: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204160.19917: _low_level_execute_command(): starting 16142 1727204160.19980: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204160.22130: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204160.22250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204160.22254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204160.22296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204160.22301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204160.22304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204160.22494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204160.22597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204160.22676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204160.24293: stdout chunk (state=3): >>>/root <<< 16142 1727204160.24398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204160.24490: stderr chunk (state=3): >>><<< 16142 1727204160.24493: stdout chunk (state=3): >>><<< 16142 1727204160.24611: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204160.24615: _low_level_execute_command(): starting 16142 1727204160.24618: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204160.2451825-20575-222981915858856 `" && echo ansible-tmp-1727204160.2451825-20575-222981915858856="` echo /root/.ansible/tmp/ansible-tmp-1727204160.2451825-20575-222981915858856 `" ) && sleep 0' 16142 1727204160.26184: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204160.26188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204160.26221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 16142 1727204160.26225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204160.26229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204160.26334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 16142 1727204160.26337: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204160.26493: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204160.26562: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204160.26655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204160.28561: stdout chunk (state=3): >>>ansible-tmp-1727204160.2451825-20575-222981915858856=/root/.ansible/tmp/ansible-tmp-1727204160.2451825-20575-222981915858856 <<< 16142 1727204160.28698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204160.28752: stderr chunk (state=3): >>><<< 16142 1727204160.28756: stdout chunk (state=3): >>><<< 16142 1727204160.29071: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204160.2451825-20575-222981915858856=/root/.ansible/tmp/ansible-tmp-1727204160.2451825-20575-222981915858856 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204160.29075: variable 'ansible_module_compression' from source: unknown 16142 1727204160.29078: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 16142 1727204160.29080: variable 'ansible_facts' from source: unknown 16142 1727204160.29083: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204160.2451825-20575-222981915858856/AnsiballZ_command.py 16142 1727204160.29625: Sending initial data 16142 1727204160.29628: Sent initial data (156 bytes) 16142 1727204160.32344: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204160.32485: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204160.32522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204160.32545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204160.32612: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204160.32695: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204160.32711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204160.32730: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 16142 1727204160.32743: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204160.32767: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204160.32787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204160.32827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204160.32845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204160.32921: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204160.32935: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204160.32950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204160.33147: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204160.33173: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204160.33191: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204160.33265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204160.35030: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204160.35052: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204160.35122: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmp9z4mrubx /root/.ansible/tmp/ansible-tmp-1727204160.2451825-20575-222981915858856/AnsiballZ_command.py <<< 16142 1727204160.35125: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204160.36510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204160.36690: stderr chunk (state=3): >>><<< 16142 1727204160.36694: stdout chunk (state=3): >>><<< 16142 1727204160.36696: done transferring module to remote 16142 1727204160.36698: _low_level_execute_command(): starting 16142 1727204160.36700: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204160.2451825-20575-222981915858856/ /root/.ansible/tmp/ansible-tmp-1727204160.2451825-20575-222981915858856/AnsiballZ_command.py && sleep 0' 16142 1727204160.38477: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204160.38492: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204160.38505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204160.38523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204160.38577: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204160.38658: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204160.38675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204160.38692: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204160.38703: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204160.38714: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204160.38725: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204160.38737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204160.38759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204160.38778: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204160.38791: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204160.38803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204160.39000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204160.39021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204160.39036: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204160.39115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204160.40921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204160.40925: stdout chunk (state=3): >>><<< 16142 1727204160.40928: stderr chunk (state=3): >>><<< 16142 1727204160.41032: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204160.41035: _low_level_execute_command(): starting 16142 1727204160.41038: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204160.2451825-20575-222981915858856/AnsiballZ_command.py && sleep 0' 16142 1727204160.42041: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 16142 
1727204160.42045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204160.42084: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204160.42087: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204160.42090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204160.42175: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204160.42188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204160.42347: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204160.56266: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:ff:f5:f2:b9 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.13.78/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3396sec preferred_lft 3396sec\n inet6 fe80::8ff:ffff:fef5:f2b9/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.13.78 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.13.78 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:56:00.553274", "end": "2024-09-24 14:56:00.561667", "delta": "0:00:00.008393", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 16142 1727204160.57414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 16142 1727204160.57487: stderr chunk (state=3): >>><<< 16142 1727204160.57490: stdout chunk (state=3): >>><<< 16142 1727204160.57642: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:ff:f5:f2:b9 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.13.78/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3396sec preferred_lft 3396sec\n inet6 fe80::8ff:ffff:fef5:f2b9/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.13.78 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.13.78 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:56:00.553274", "end": "2024-09-24 14:56:00.561667", "delta": "0:00:00.008393", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
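[annotation] The module run above is the "Check routes and DNS" task from check_network_dns.yml:6. Its shell body is visible verbatim in the cmd field of the result, so the task can be sketched as below; only the YAML wrapper (the name and the use of the shell module, matching the shell action plugin loaded earlier) is assumed.

    # "Check routes and DNS" (check_network_dns.yml:6): body reproduced from the cmd
    # string in the result above; the YAML wrapper itself is a sketch.
    - name: Check routes and DNS
      shell: |
        set -euo pipefail
        echo IP
        ip a
        echo IP ROUTE
        ip route
        echo IP -6 ROUTE
        ip -6 route
        echo RESOLV
        if [ -f /etc/resolv.conf ]; then
         cat /etc/resolv.conf
        else
         echo NO /etc/resolv.conf
         ls -alrtF /etc/resolv.* || :
        fi
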
16142 1727204160.57652: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204160.2451825-20575-222981915858856/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204160.57655: _low_level_execute_command(): starting 16142 1727204160.57657: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204160.2451825-20575-222981915858856/ > /dev/null 2>&1 && sleep 0' 16142 1727204160.59012: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204160.59181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204160.59285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204160.59305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204160.59481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204160.59498: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204160.59513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204160.59532: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204160.59549: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204160.59567: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204160.59582: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204160.59597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204160.59620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204160.59633: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204160.59647: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204160.59662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204160.59817: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204160.59849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204160.59869: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204160.59953: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204160.61792: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 16142 1727204160.61826: stderr chunk (state=3): >>><<< 16142 1727204160.61829: stdout chunk (state=3): >>><<< 16142 1727204160.62076: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204160.62079: handler run complete 16142 1727204160.62081: Evaluated conditional (False): False 16142 1727204160.62083: attempt loop complete, returning result 16142 1727204160.62085: _execute() done 16142 1727204160.62087: dumping result to json 16142 1727204160.62088: done dumping result, returning 16142 1727204160.62090: done running TaskExecutor() for managed-node2/TASK: Check routes and DNS [0affcd87-79f5-fddd-f6c7-0000000009f0] 16142 1727204160.62092: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000009f0 16142 1727204160.62168: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000009f0 16142 1727204160.62171: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008393", "end": "2024-09-24 14:56:00.561667", "rc": 0, "start": "2024-09-24 14:56:00.553274" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:ff:f5:f2:b9 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.13.78/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 3396sec preferred_lft 3396sec inet6 fe80::8ff:ffff:fef5:f2b9/64 scope link valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.13.78 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.13.78 metric 100 IP -6 ROUTE ::1 dev lo proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 16142 1727204160.62252: no more pending results, returning what we have 16142 1727204160.62258: results queue empty 16142 
1727204160.62259: checking for any_errors_fatal 16142 1727204160.62261: done checking for any_errors_fatal 16142 1727204160.62262: checking for max_fail_percentage 16142 1727204160.62271: done checking for max_fail_percentage 16142 1727204160.62272: checking to see if all hosts have failed and the running result is not ok 16142 1727204160.62274: done checking to see if all hosts have failed 16142 1727204160.62274: getting the remaining hosts for this loop 16142 1727204160.62276: done getting the remaining hosts for this loop 16142 1727204160.62281: getting the next task for host managed-node2 16142 1727204160.62290: done getting next task for host managed-node2 16142 1727204160.62293: ^ task is: TASK: Verify DNS and network connectivity 16142 1727204160.62297: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 16142 1727204160.62307: getting variables 16142 1727204160.62309: in VariableManager get_vars() 16142 1727204160.62375: Calling all_inventory to load vars for managed-node2 16142 1727204160.62379: Calling groups_inventory to load vars for managed-node2 16142 1727204160.62381: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204160.62393: Calling all_plugins_play to load vars for managed-node2 16142 1727204160.62396: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204160.62400: Calling groups_plugins_play to load vars for managed-node2 16142 1727204160.65481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204160.67473: done with get_vars() 16142 1727204160.67511: done getting variables 16142 1727204160.67583: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 14:56:00 -0400 (0:00:00.535) 0:00:59.853 ***** 16142 1727204160.67624: entering _queue_task() for managed-node2/shell 16142 1727204160.68009: worker is 1 (out of 1 available) 16142 1727204160.68023: exiting _queue_task() for managed-node2/shell 16142 1727204160.68048: done queuing things up, now waiting for results queue to drain 16142 1727204160.68049: waiting for pending results... 
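[annotation] Throughout these tasks the executor resolves ansible_host and ansible_ssh_extra_args from host vars for managed-node2 and then reuses a multiplexed SSH session to 10.31.13.78, as the OpenSSH debug output shows. An inventory entry consistent with that behaviour would look roughly like the following; apart from the two variable names and the address, everything here is an assumption, and the actual ansible_ssh_extra_args value never appears in the captured output.

    # Hypothetical inventory snippet, for illustration only.
    all:
      hosts:
        managed-node2:
          ansible_host: 10.31.13.78
          # ansible_ssh_extra_args is also supplied via host vars (per the
          # "from source: host vars" entries), but its value is not captured here.
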
16142 1727204160.68364: running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity 16142 1727204160.68540: in run() - task 0affcd87-79f5-fddd-f6c7-0000000009f1 16142 1727204160.68562: variable 'ansible_search_path' from source: unknown 16142 1727204160.68573: variable 'ansible_search_path' from source: unknown 16142 1727204160.68720: calling self._execute() 16142 1727204160.68855: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204160.68868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204160.68882: variable 'omit' from source: magic vars 16142 1727204160.69297: variable 'ansible_distribution_major_version' from source: facts 16142 1727204160.69314: Evaluated conditional (ansible_distribution_major_version != '6'): True 16142 1727204160.69473: variable 'ansible_facts' from source: unknown 16142 1727204160.70356: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 16142 1727204160.70369: variable 'omit' from source: magic vars 16142 1727204160.70417: variable 'omit' from source: magic vars 16142 1727204160.70475: variable 'omit' from source: magic vars 16142 1727204160.70524: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16142 1727204160.70605: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16142 1727204160.70670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16142 1727204160.70720: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204160.70735: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16142 1727204160.70776: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16142 1727204160.70805: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204160.70821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204160.70966: Set connection var ansible_timeout to 10 16142 1727204160.70975: Set connection var ansible_connection to ssh 16142 1727204160.70985: Set connection var ansible_shell_type to sh 16142 1727204160.70995: Set connection var ansible_shell_executable to /bin/sh 16142 1727204160.71013: Set connection var ansible_module_compression to ZIP_DEFLATED 16142 1727204160.71026: Set connection var ansible_pipelining to False 16142 1727204160.71058: variable 'ansible_shell_executable' from source: unknown 16142 1727204160.71069: variable 'ansible_connection' from source: unknown 16142 1727204160.71077: variable 'ansible_module_compression' from source: unknown 16142 1727204160.71083: variable 'ansible_shell_type' from source: unknown 16142 1727204160.71089: variable 'ansible_shell_executable' from source: unknown 16142 1727204160.71095: variable 'ansible_host' from source: host vars for 'managed-node2' 16142 1727204160.71102: variable 'ansible_pipelining' from source: unknown 16142 1727204160.71108: variable 'ansible_timeout' from source: unknown 16142 1727204160.71125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16142 1727204160.71290: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204160.71306: variable 'omit' from source: magic vars 16142 1727204160.71317: starting attempt loop 16142 1727204160.71324: running the handler 16142 1727204160.71352: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 16142 1727204160.71380: _low_level_execute_command(): starting 16142 1727204160.71394: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16142 1727204160.72250: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204160.72269: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204160.72287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204160.72307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204160.72367: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204160.72381: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204160.72397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204160.72419: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204160.72443: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204160.72456: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204160.72472: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204160.72487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204160.72505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204160.72519: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204160.72539: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204160.72559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204160.72643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204160.72677: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204160.72696: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204160.72779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204160.74334: stdout chunk (state=3): >>>/root <<< 16142 1727204160.74547: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204160.74552: stdout chunk (state=3): >>><<< 16142 1727204160.74555: stderr chunk (state=3): >>><<< 16142 1727204160.74687: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204160.74698: _low_level_execute_command(): starting 16142 1727204160.74701: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204160.7459106-20597-132313475031163 `" && echo ansible-tmp-1727204160.7459106-20597-132313475031163="` echo /root/.ansible/tmp/ansible-tmp-1727204160.7459106-20597-132313475031163 `" ) && sleep 0' 16142 1727204160.75317: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204160.75346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204160.75362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204160.75385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204160.75425: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204160.75442: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204160.75469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204160.75487: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204160.75500: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204160.75512: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204160.75524: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204160.75541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204160.75571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204160.75587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204160.75598: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204160.75612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204160.75702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204160.75724: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204160.75743: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204160.75821: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 16142 1727204160.77653: stdout chunk (state=3): >>>ansible-tmp-1727204160.7459106-20597-132313475031163=/root/.ansible/tmp/ansible-tmp-1727204160.7459106-20597-132313475031163 <<< 16142 1727204160.77781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204160.77871: stderr chunk (state=3): >>><<< 16142 1727204160.77879: stdout chunk (state=3): >>><<< 16142 1727204160.77984: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204160.7459106-20597-132313475031163=/root/.ansible/tmp/ansible-tmp-1727204160.7459106-20597-132313475031163 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204160.77987: variable 'ansible_module_compression' from source: unknown 16142 1727204160.78074: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16142r2pfd04r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 16142 1727204160.78170: variable 'ansible_facts' from source: unknown 16142 1727204160.78196: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204160.7459106-20597-132313475031163/AnsiballZ_command.py 16142 1727204160.78456: Sending initial data 16142 1727204160.78461: Sent initial data (156 bytes) 16142 1727204160.81917: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204160.81925: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204160.81939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204160.81951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204160.81997: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204160.82001: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204160.82012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204160.82053: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204160.82087: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204160.82122: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204160.82160: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204160.82171: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204160.82184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204160.82231: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204160.82243: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204160.82266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204160.82381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204160.82396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204160.82400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204160.82587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204160.84215: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16142 1727204160.84254: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 16142 1727204160.84292: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16142r2pfd04r/tmpahoibb59 /root/.ansible/tmp/ansible-tmp-1727204160.7459106-20597-132313475031163/AnsiballZ_command.py <<< 16142 1727204160.84329: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 16142 1727204160.85788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204160.85976: stderr chunk (state=3): >>><<< 16142 1727204160.85981: stdout chunk (state=3): >>><<< 16142 1727204160.85984: done transferring module to remote 16142 1727204160.85998: _low_level_execute_command(): starting 16142 1727204160.86001: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204160.7459106-20597-132313475031163/ /root/.ansible/tmp/ansible-tmp-1727204160.7459106-20597-132313475031163/AnsiballZ_command.py && sleep 0' 16142 1727204160.89189: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204160.89284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204160.89294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204160.89309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204160.89350: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204160.89430: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204160.89459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204160.89490: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass <<< 16142 1727204160.89493: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204160.89514: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204160.89522: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204160.89695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204160.89707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204160.89717: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204160.89725: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204160.89741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204160.89988: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204160.90075: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204160.90079: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204160.90358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204160.92089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204160.92092: stdout chunk (state=3): >>><<< 16142 1727204160.92109: stderr chunk (state=3): >>><<< 16142 1727204160.92145: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204160.92149: _low_level_execute_command(): starting 16142 1727204160.92154: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204160.7459106-20597-132313475031163/AnsiballZ_command.py && sleep 0' 16142 1727204160.93304: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204160.93311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204160.93322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204160.93335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204160.93403: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 
originally 10.31.13.78 <<< 16142 1727204160.93410: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204160.93419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204160.93433: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204160.93442: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204160.93449: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204160.93455: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204160.93466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204160.93479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204160.93488: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204160.93491: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204160.93501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204160.94129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204160.94132: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204160.94314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204160.94562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204161.30673: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 3961 0 --:--:-- --:--:-- --:--:-- 4013\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2328 0 --:--:-- --:--:-- --:--:-- 2328", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! 
getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:56:01.075704", "end": "2024-09-24 14:56:01.302389", "delta": "0:00:00.226685", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 16142 1727204161.31688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 16142 1727204161.31774: stderr chunk (state=3): >>><<< 16142 1727204161.31778: stdout chunk (state=3): >>><<< 16142 1727204161.31801: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 3961 0 --:--:-- --:--:-- --:--:-- 4013\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2328 0 --:--:-- --:--:-- --:--:-- 2328", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:56:01.075704", "end": "2024-09-24 14:56:01.302389", "delta": "0:00:00.226685", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! 
getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 16142 1727204161.31851: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204160.7459106-20597-132313475031163/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16142 1727204161.31859: _low_level_execute_command(): starting 16142 1727204161.31867: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204160.7459106-20597-132313475031163/ > /dev/null 2>&1 && sleep 0' 16142 1727204161.33221: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 16142 1727204161.33236: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204161.33253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204161.33275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204161.33328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204161.33341: stderr chunk (state=3): >>>debug2: match not found <<< 16142 1727204161.33356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204161.33377: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16142 1727204161.33390: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 16142 1727204161.33408: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16142 1727204161.33421: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16142 1727204161.33434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16142 1727204161.33450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16142 1727204161.33463: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 16142 1727204161.33478: stderr chunk (state=3): >>>debug2: match found <<< 16142 1727204161.33493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16142 1727204161.33575: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16142 1727204161.33599: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16142 1727204161.33621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16142 1727204161.33695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16142 1727204161.35618: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16142 1727204161.35626: stdout chunk (state=3): >>><<< 16142 1727204161.35628: stderr chunk (state=3): >>><<< 16142 1727204161.35673: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16142 1727204161.35677: handler run complete 16142 1727204161.35775: Evaluated conditional (False): False 16142 1727204161.35778: attempt loop complete, returning result 16142 1727204161.35780: _execute() done 16142 1727204161.35783: dumping result to json 16142 1727204161.35785: done dumping result, returning 16142 1727204161.35786: done running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity [0affcd87-79f5-fddd-f6c7-0000000009f1] 16142 1727204161.35788: sending task result for task 0affcd87-79f5-fddd-f6c7-0000000009f1 ok: [managed-node2] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.226685", "end": "2024-09-24 14:56:01.302389", "rc": 0, "start": "2024-09-24 14:56:01.075704" } STDOUT: CHECK DNS AND CONNECTIVITY 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 3961 0 --:--:-- --:--:-- --:--:-- 4013 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 2328 0 --:--:-- --:--:-- --:--:-- 2328 16142 1727204161.35932: no more pending results, returning what we have 16142 1727204161.35935: results queue empty 16142 1727204161.35938: checking for any_errors_fatal 16142 1727204161.35948: done checking for any_errors_fatal 16142 1727204161.35949: checking for max_fail_percentage 16142 1727204161.35950: done checking for max_fail_percentage 16142 1727204161.35951: checking to see if all hosts have failed and the running result is not ok 16142 1727204161.35952: done checking to see if all hosts have failed 16142 1727204161.35953: getting the remaining hosts for this loop 16142 1727204161.35954: done getting the remaining hosts for this loop 16142 1727204161.35958: getting the next task for host managed-node2 16142 1727204161.35969: done getting next task for host managed-node2 16142 1727204161.35972: ^ task is: TASK: meta (flush_handlers) 16142 1727204161.35974: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204161.35979: getting variables 16142 1727204161.35981: in VariableManager get_vars() 16142 1727204161.36035: Calling all_inventory to load vars for managed-node2 16142 1727204161.36040: Calling groups_inventory to load vars for managed-node2 16142 1727204161.36043: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204161.36054: Calling all_plugins_play to load vars for managed-node2 16142 1727204161.36057: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204161.36060: Calling groups_plugins_play to load vars for managed-node2 16142 1727204161.36593: done sending task result for task 0affcd87-79f5-fddd-f6c7-0000000009f1 16142 1727204161.36596: WORKER PROCESS EXITING 16142 1727204161.38095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204161.40193: done with get_vars() 16142 1727204161.40234: done getting variables 16142 1727204161.40308: in VariableManager get_vars() 16142 1727204161.40337: Calling all_inventory to load vars for managed-node2 16142 1727204161.40340: Calling groups_inventory to load vars for managed-node2 16142 1727204161.40342: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204161.40348: Calling all_plugins_play to load vars for managed-node2 16142 1727204161.40350: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204161.40353: Calling groups_plugins_play to load vars for managed-node2 16142 1727204161.42131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204161.44386: done with get_vars() 16142 1727204161.44430: done queuing things up, now waiting for results queue to drain 16142 1727204161.44432: results queue empty 16142 1727204161.44433: checking for any_errors_fatal 16142 1727204161.44440: done checking for any_errors_fatal 16142 1727204161.44441: checking for max_fail_percentage 16142 1727204161.44442: done checking for max_fail_percentage 16142 1727204161.44443: checking to see if all hosts have failed and the running result is not ok 16142 1727204161.44444: done checking to see if all hosts have failed 16142 1727204161.44445: getting the remaining hosts for this loop 16142 1727204161.44446: done getting the remaining hosts for this loop 16142 1727204161.44448: getting the next task for host managed-node2 16142 1727204161.44453: done getting next task for host managed-node2 16142 1727204161.44454: ^ task is: TASK: meta (flush_handlers) 16142 1727204161.44456: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204161.44468: getting variables 16142 1727204161.44469: in VariableManager get_vars() 16142 1727204161.44494: Calling all_inventory to load vars for managed-node2 16142 1727204161.44496: Calling groups_inventory to load vars for managed-node2 16142 1727204161.44499: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204161.44505: Calling all_plugins_play to load vars for managed-node2 16142 1727204161.44507: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204161.44510: Calling groups_plugins_play to load vars for managed-node2 16142 1727204161.47733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204161.49559: done with get_vars() 16142 1727204161.49592: done getting variables 16142 1727204161.49647: in VariableManager get_vars() 16142 1727204161.49681: Calling all_inventory to load vars for managed-node2 16142 1727204161.49684: Calling groups_inventory to load vars for managed-node2 16142 1727204161.49686: Calling all_plugins_inventory to load vars for managed-node2 16142 1727204161.49692: Calling all_plugins_play to load vars for managed-node2 16142 1727204161.49694: Calling groups_plugins_inventory to load vars for managed-node2 16142 1727204161.49697: Calling groups_plugins_play to load vars for managed-node2 16142 1727204161.52427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16142 1727204161.54696: done with get_vars() 16142 1727204161.54730: done queuing things up, now waiting for results queue to drain 16142 1727204161.54733: results queue empty 16142 1727204161.54734: checking for any_errors_fatal 16142 1727204161.54735: done checking for any_errors_fatal 16142 1727204161.54736: checking for max_fail_percentage 16142 1727204161.54740: done checking for max_fail_percentage 16142 1727204161.54740: checking to see if all hosts have failed and the running result is not ok 16142 1727204161.54741: done checking to see if all hosts have failed 16142 1727204161.54742: getting the remaining hosts for this loop 16142 1727204161.54743: done getting the remaining hosts for this loop 16142 1727204161.54746: getting the next task for host managed-node2 16142 1727204161.54750: done getting next task for host managed-node2 16142 1727204161.54751: ^ task is: None 16142 1727204161.54753: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16142 1727204161.54754: done queuing things up, now waiting for results queue to drain 16142 1727204161.54755: results queue empty 16142 1727204161.54755: checking for any_errors_fatal 16142 1727204161.54756: done checking for any_errors_fatal 16142 1727204161.54757: checking for max_fail_percentage 16142 1727204161.54758: done checking for max_fail_percentage 16142 1727204161.54758: checking to see if all hosts have failed and the running result is not ok 16142 1727204161.54759: done checking to see if all hosts have failed 16142 1727204161.54762: getting the next task for host managed-node2 16142 1727204161.54769: done getting next task for host managed-node2 16142 1727204161.54770: ^ task is: None 16142 1727204161.54772: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node2              : ok=109  changed=5    unreachable=0    failed=0    skipped=120  rescued=0    ignored=0

Tuesday 24 September 2024  14:56:01 -0400 (0:00:00.872)       0:01:00.725 *****
===============================================================================
Create test interfaces -------------------------------------------------- 1.84s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Install dnsmasq --------------------------------------------------------- 1.82s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
Gathering Facts --------------------------------------------------------- 1.79s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:6
fedora.linux_system_roles.network : Check which services are running ---- 1.69s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.67s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.65s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.60s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.59s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which packages are installed --- 1.31s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.27s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Install pgrep, sysctl --------------------------------------------------- 1.25s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 1.23s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.18s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 1.12s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 1.04s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Check if system is ostree ----------------------------------------------- 1.01s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.99s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Check which packages are installed --- 0.99s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.94s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 0.89s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
16142 1727204161.54954: RUNNING CLEANUP
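
For reference, the "Verify DNS and network connectivity" task whose result appears above can be sketched from the log itself. The task name and the shell script below are taken verbatim from the recorded module invocation; the use of ansible.builtin.shell and the changed_when: false setting are assumptions, inferred from "_uses_shell": true in the module args and from the "Evaluated conditional (False): False" entry followed by "changed": false in the reported result. This is a minimal reconstruction, not the original test source.

    # Sketch reconstructed from the log above; names and options marked as
    # assumptions are not confirmed by the original test source.
    - name: Verify DNS and network connectivity
      ansible.builtin.shell: |
        set -euo pipefail
        echo CHECK DNS AND CONNECTIVITY
        for host in mirrors.fedoraproject.org mirrors.centos.org; do
          if ! getent hosts "$host"; then
            echo FAILED to lookup host "$host"
            exit 1
          fi
          if ! curl -o /dev/null https://"$host"; then
            echo FAILED to contact host "$host"
            exit 1
          fi
        done
      # Assumption: the task reports changed: false even though the module
      # returned changed: true, which is consistent with changed_when: false.
      changed_when: false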