12372 1727204072.08233: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-twx
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
12372 1727204072.08569: Added group all to inventory
12372 1727204072.08571: Added group ungrouped to inventory
12372 1727204072.08575: Group all now contains ungrouped
12372 1727204072.08577: Examining possible inventory source: /tmp/network-6Zh/inventory-Sfc.yml
12372 1727204072.34020: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
12372 1727204072.34220: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
12372 1727204072.34249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
12372 1727204072.34334: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
12372 1727204072.34647: Loaded config def from plugin (inventory/script)
12372 1727204072.34650: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
12372 1727204072.34704: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
12372 1727204072.35313: Loaded config def from plugin (inventory/yaml)
12372 1727204072.35319: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
12372 1727204072.35491: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
12372 1727204072.36471: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
12372 1727204072.36475: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
12372 1727204072.36479: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
12372 1727204072.36486: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
12372 1727204072.36494: Loading data from /tmp/network-6Zh/inventory-Sfc.yml
12372 1727204072.36758: /tmp/network-6Zh/inventory-Sfc.yml was not parsable by auto
12372 1727204072.36941: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
12372 1727204072.36998: Loading data from /tmp/network-6Zh/inventory-Sfc.yml
12372 1727204072.37121: group all already in inventory
12372 1727204072.37130: set inventory_file for managed-node1
12372 1727204072.37168: set inventory_dir for managed-node1
12372 1727204072.37170: Added host managed-node1 to inventory
12372 1727204072.37173: Added host managed-node1 to group all
12372 1727204072.37485: set ansible_host for managed-node1
12372 1727204072.37487: set ansible_ssh_extra_args for managed-node1
12372 1727204072.37495: set inventory_file for managed-node2
12372 1727204072.37500: set inventory_dir for managed-node2
12372 1727204072.37501: Added host managed-node2 to inventory
12372 1727204072.37503: Added host managed-node2 to group
all 12372 1727204072.37505: set ansible_host for managed-node2 12372 1727204072.37506: set ansible_ssh_extra_args for managed-node2 12372 1727204072.37509: set inventory_file for managed-node3 12372 1727204072.37512: set inventory_dir for managed-node3 12372 1727204072.37514: Added host managed-node3 to inventory 12372 1727204072.37518: Added host managed-node3 to group all 12372 1727204072.37601: set ansible_host for managed-node3 12372 1727204072.37603: set ansible_ssh_extra_args for managed-node3 12372 1727204072.37607: Reconcile groups and hosts in inventory. 12372 1727204072.37613: Group ungrouped now contains managed-node1 12372 1727204072.37618: Group ungrouped now contains managed-node2 12372 1727204072.37620: Group ungrouped now contains managed-node3 12372 1727204072.37888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 12372 1727204072.38179: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 12372 1727204072.38253: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 12372 1727204072.38344: Loaded config def from plugin (vars/host_group_vars) 12372 1727204072.38348: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 12372 1727204072.38357: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 12372 1727204072.38367: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 12372 1727204072.38507: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 12372 1727204072.39226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204072.39413: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 12372 1727204072.39471: Loaded config def from plugin (connection/local) 12372 1727204072.39475: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 12372 1727204072.40949: Loaded config def from plugin (connection/paramiko_ssh) 12372 1727204072.40953: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 12372 1727204072.42852: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 12372 1727204072.42919: Loaded config def from plugin (connection/psrp) 12372 1727204072.42924: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 12372 1727204072.45081: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 12372 1727204072.45145: Loaded config def from plugin (connection/ssh) 12372 1727204072.45150: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 12372 1727204072.49232: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 12372 1727204072.49393: Loaded config def from plugin (connection/winrm) 12372 1727204072.49399: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 12372 1727204072.49442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 12372 1727204072.49530: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 12372 1727204072.49676: Loaded config def from plugin (shell/cmd) 12372 1727204072.49678: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 12372 1727204072.49715: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 12372 1727204072.49820: Loaded config def from plugin (shell/powershell) 12372 1727204072.49823: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 12372 1727204072.49892: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 12372 1727204072.50196: Loaded config def from plugin (shell/sh) 12372 1727204072.50199: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 12372 1727204072.50247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 12372 1727204072.50431: Loaded config def from plugin (become/runas) 12372 1727204072.50435: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 12372 1727204072.50920: Loaded config def from plugin (become/su) 12372 1727204072.50923: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 12372 1727204072.51188: Loaded config def from plugin (become/sudo) 12372 1727204072.51296: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 12372 1727204072.51343: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_initscripts.yml 12372 1727204072.51804: in VariableManager get_vars() 12372 1727204072.51869: done with get_vars() 12372 1727204072.52219: trying /usr/local/lib/python3.12/site-packages/ansible/modules 12372 1727204072.56630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 12372 1727204072.56786: in VariableManager get_vars() 12372 1727204072.56793: done with get_vars() 12372 1727204072.56796: variable 'playbook_dir' from source: magic vars 12372 1727204072.56797: variable 'ansible_playbook_python' from source: magic vars 12372 1727204072.56798: variable 
'ansible_config_file' from source: magic vars 12372 1727204072.56799: variable 'groups' from source: magic vars 12372 1727204072.56800: variable 'omit' from source: magic vars 12372 1727204072.56801: variable 'ansible_version' from source: magic vars 12372 1727204072.56802: variable 'ansible_check_mode' from source: magic vars 12372 1727204072.56803: variable 'ansible_diff_mode' from source: magic vars 12372 1727204072.56804: variable 'ansible_forks' from source: magic vars 12372 1727204072.56805: variable 'ansible_inventory_sources' from source: magic vars 12372 1727204072.56806: variable 'ansible_skip_tags' from source: magic vars 12372 1727204072.56807: variable 'ansible_limit' from source: magic vars 12372 1727204072.56808: variable 'ansible_run_tags' from source: magic vars 12372 1727204072.56809: variable 'ansible_verbosity' from source: magic vars 12372 1727204072.56860: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml 12372 1727204072.58557: in VariableManager get_vars() 12372 1727204072.58576: done with get_vars() 12372 1727204072.58586: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 12372 1727204072.60298: in VariableManager get_vars() 12372 1727204072.60321: done with get_vars() 12372 1727204072.60336: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 12372 1727204072.60486: in VariableManager get_vars() 12372 1727204072.60508: done with get_vars() 12372 1727204072.60783: in VariableManager get_vars() 12372 1727204072.60804: done with get_vars() 12372 1727204072.60814: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 12372 1727204072.60976: in VariableManager get_vars() 12372 1727204072.61023: done with get_vars() 12372 1727204072.61460: in VariableManager get_vars() 12372 1727204072.61477: done with get_vars() 12372 1727204072.61483: variable 'omit' from source: magic vars 12372 1727204072.61510: variable 'omit' from source: magic vars 12372 1727204072.61564: in VariableManager get_vars() 12372 1727204072.61578: done with get_vars() 12372 1727204072.61647: in VariableManager get_vars() 12372 1727204072.61663: done with get_vars() 12372 1727204072.61710: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 12372 1727204072.62042: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 12372 1727204072.62239: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 12372 1727204072.69209: in VariableManager get_vars() 12372 1727204072.69313: done with get_vars() 12372 1727204072.70581: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 12372 1727204072.70872: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12372 1727204072.74160: in VariableManager get_vars() 12372 1727204072.74191: done with get_vars() 12372 1727204072.74204: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 12372 1727204072.74337: in VariableManager get_vars() 12372 1727204072.74363: done with get_vars() 12372 1727204072.74541: in VariableManager get_vars() 12372 1727204072.74563: done with get_vars() 12372 1727204072.74956: in VariableManager get_vars() 12372 1727204072.74977: done with get_vars() 12372 1727204072.74984: variable 'omit' from source: magic vars 12372 1727204072.75000: variable 'omit' from source: magic vars 12372 1727204072.75234: variable 'controller_profile' from source: play vars 12372 1727204072.75302: in VariableManager get_vars() 12372 1727204072.75323: done with get_vars() 12372 1727204072.75349: in VariableManager get_vars() 12372 1727204072.75374: done with get_vars() 12372 1727204072.75443: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 12372 1727204072.75813: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 12372 1727204072.75956: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 12372 1727204072.76722: in VariableManager get_vars() 12372 1727204072.76759: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12372 1727204072.81176: in VariableManager get_vars() 12372 1727204072.81279: done with get_vars() 12372 1727204072.81287: variable 'omit' from source: magic vars 12372 1727204072.81304: variable 'omit' from source: magic vars 12372 1727204072.81348: in VariableManager get_vars() 12372 1727204072.81493: done with get_vars() 12372 1727204072.81524: in VariableManager get_vars() 12372 1727204072.81550: done with get_vars() 12372 1727204072.81699: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 12372 1727204072.81995: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 12372 1727204072.82203: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 12372 1727204072.83283: in VariableManager get_vars() 12372 1727204072.83367: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12372 1727204072.86874: in VariableManager get_vars() 12372 1727204072.87050: done with get_vars() 12372 1727204072.87058: variable 'omit' from source: magic vars 12372 1727204072.87075: variable 'omit' from source: magic vars 12372 1727204072.87119: in VariableManager get_vars() 12372 1727204072.87276: done with get_vars() 12372 1727204072.87363: in VariableManager get_vars() 12372 1727204072.87392: done with get_vars() 12372 1727204072.87430: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 12372 1727204072.87850: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 
12372 1727204072.87982: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 12372 1727204072.88637: in VariableManager get_vars() 12372 1727204072.88678: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12372 1727204072.92553: in VariableManager get_vars() 12372 1727204072.92659: done with get_vars() 12372 1727204072.92668: variable 'omit' from source: magic vars 12372 1727204072.92711: variable 'omit' from source: magic vars 12372 1727204072.92767: in VariableManager get_vars() 12372 1727204072.92799: done with get_vars() 12372 1727204072.92834: in VariableManager get_vars() 12372 1727204072.92864: done with get_vars() 12372 1727204072.92902: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 12372 1727204072.93133: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 12372 1727204072.93338: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 12372 1727204072.94411: in VariableManager get_vars() 12372 1727204072.94487: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12372 1727204072.97635: in VariableManager get_vars() 12372 1727204072.97675: done with get_vars() 12372 1727204072.97685: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 12372 1727204072.98403: in VariableManager get_vars() 12372 1727204072.98441: done with get_vars() 12372 1727204072.98529: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 12372 1727204072.98549: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 12372 1727204072.98864: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 12372 1727204072.99131: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 12372 1727204072.99134: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-twx/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 12372 1727204072.99173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 12372 1727204072.99205: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 12372 1727204072.99448: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 12372 1727204072.99536: Loaded config def from plugin (callback/default) 12372 1727204072.99539: Loading CallbackModule 'default' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 12372 1727204073.01084: Loaded config def from plugin (callback/junit) 12372 1727204073.01088: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 12372 1727204073.01144: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 12372 1727204073.01238: Loaded config def from plugin (callback/minimal) 12372 1727204073.01241: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 12372 1727204073.01291: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 12372 1727204073.01373: Loaded config def from plugin (callback/tree) 12372 1727204073.01376: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 12372 1727204073.01538: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 12372 1727204073.01541: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-twx/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
PLAYBOOK: tests_bond_removal_initscripts.yml ***********************************
2 plays in /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_initscripts.yml
12372 1727204073.01575: in VariableManager get_vars()
12372 1727204073.01594: done with get_vars()
12372 1727204073.01600: in VariableManager get_vars()
12372 1727204073.01616: done with get_vars()
12372 1727204073.01621: variable 'omit' from source: magic vars
12372 1727204073.01676: in VariableManager get_vars()
12372 1727204073.01695: done with get_vars()
12372 1727204073.01720: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bond_removal.yml' with initscripts as provider] ***
12372 1727204073.02432: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
12372 1727204073.02527: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
12372 1727204073.02562: getting the remaining hosts for this loop
12372 1727204073.02564: done getting the remaining hosts for this loop
12372 1727204073.02567: getting the next task for host managed-node3
12372 1727204073.02572: done getting next task for host managed-node3
12372 1727204073.02574: ^ task is: TASK: Gathering Facts
12372 1727204073.02576: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12372 1727204073.02579: getting variables
12372 1727204073.02580: in VariableManager get_vars()
12372 1727204073.02595: Calling all_inventory to load vars for managed-node3
12372 1727204073.02598: Calling groups_inventory to load vars for managed-node3
12372 1727204073.02602: Calling all_plugins_inventory to load vars for managed-node3
12372 1727204073.02622: Calling all_plugins_play to load vars for managed-node3
12372 1727204073.02637: Calling groups_plugins_inventory to load vars for managed-node3
12372 1727204073.02641: Calling groups_plugins_play to load vars for managed-node3
12372 1727204073.02684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12372 1727204073.02759: done with get_vars()
12372 1727204073.02767: done getting variables
12372 1727204073.02850: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_initscripts.yml:5
Tuesday 24 September 2024 14:54:33 -0400 (0:00:00.014) 0:00:00.014 *****
12372 1727204073.02876: entering _queue_task() for managed-node3/gather_facts
12372 1727204073.02878: Creating lock for gather_facts
12372 1727204073.03310: worker is 1 (out of 1 available)
12372 1727204073.03324: exiting _queue_task() for managed-node3/gather_facts
12372 1727204073.03342: done queuing things up, now waiting for results queue to drain
12372 1727204073.03344: waiting for pending results...
12372 1727204073.03713: running TaskExecutor() for managed-node3/TASK: Gathering Facts 12372 1727204073.03722: in run() - task 12b410aa-8751-244a-02f9-0000000001bc 12372 1727204073.03726: variable 'ansible_search_path' from source: unknown 12372 1727204073.03751: calling self._execute() 12372 1727204073.03828: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204073.03846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204073.03860: variable 'omit' from source: magic vars 12372 1727204073.03985: variable 'omit' from source: magic vars 12372 1727204073.04032: variable 'omit' from source: magic vars 12372 1727204073.04085: variable 'omit' from source: magic vars 12372 1727204073.04144: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12372 1727204073.04199: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12372 1727204073.04229: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12372 1727204073.04255: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12372 1727204073.04278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12372 1727204073.04324: variable 'inventory_hostname' from source: host vars for 'managed-node3' 12372 1727204073.04382: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204073.04386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204073.04473: Set connection var ansible_connection to ssh 12372 1727204073.04502: Set connection var ansible_timeout to 10 12372 1727204073.04514: Set connection var ansible_module_compression to ZIP_DEFLATED 12372 1727204073.04529: Set connection var ansible_shell_executable to /bin/sh 12372 1727204073.04536: Set connection var ansible_shell_type to sh 12372 1727204073.04551: Set connection var ansible_pipelining to False 12372 1727204073.04580: variable 'ansible_shell_executable' from source: unknown 12372 1727204073.04588: variable 'ansible_connection' from source: unknown 12372 1727204073.04695: variable 'ansible_module_compression' from source: unknown 12372 1727204073.04699: variable 'ansible_shell_type' from source: unknown 12372 1727204073.04702: variable 'ansible_shell_executable' from source: unknown 12372 1727204073.04705: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204073.04707: variable 'ansible_pipelining' from source: unknown 12372 1727204073.04709: variable 'ansible_timeout' from source: unknown 12372 1727204073.04711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204073.04864: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12372 1727204073.04883: variable 'omit' from source: magic vars 12372 1727204073.04895: starting attempt loop 12372 1727204073.04903: running the handler 12372 1727204073.04934: variable 'ansible_facts' from source: unknown 12372 1727204073.05050: _low_level_execute_command(): starting 12372 1727204073.05054: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12372 1727204073.05780: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12372 1727204073.05809: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12372 1727204073.05934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12372 1727204073.05957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12372 1727204073.06039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12372 1727204073.08257: stdout chunk (state=3): >>>/root <<< 12372 1727204073.08261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12372 1727204073.08264: stdout chunk (state=3): >>><<< 12372 1727204073.08266: stderr chunk (state=3): >>><<< 12372 1727204073.08270: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12372 1727204073.08272: _low_level_execute_command(): starting 12372 1727204073.08275: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204073.0815492-12672-26799053669801 `" && echo ansible-tmp-1727204073.0815492-12672-26799053669801="` echo /root/.ansible/tmp/ansible-tmp-1727204073.0815492-12672-26799053669801 `" ) && sleep 0' 12372 1727204073.09997: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 12372 1727204073.10128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12372 1727204073.10144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12372 1727204073.10155: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12372 1727204073.10461: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12372 1727204073.10513: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12372 1727204073.10604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12372 1727204073.12797: stdout chunk (state=3): >>>ansible-tmp-1727204073.0815492-12672-26799053669801=/root/.ansible/tmp/ansible-tmp-1727204073.0815492-12672-26799053669801 <<< 12372 1727204073.12810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12372 1727204073.12813: stderr chunk (state=3): >>><<< 12372 1727204073.12819: stdout chunk (state=3): >>><<< 12372 1727204073.12842: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204073.0815492-12672-26799053669801=/root/.ansible/tmp/ansible-tmp-1727204073.0815492-12672-26799053669801 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12372 1727204073.12883: variable 'ansible_module_compression' from source: unknown 12372 1727204073.12943: ANSIBALLZ: Using generic lock for ansible.legacy.setup 12372 1727204073.12947: ANSIBALLZ: Acquiring lock 12372 1727204073.12950: ANSIBALLZ: Lock acquired: 140438070171200 12372 1727204073.12956: ANSIBALLZ: Creating module 12372 1727204073.55473: ANSIBALLZ: Writing module into payload 12372 1727204073.55679: ANSIBALLZ: Writing module 12372 1727204073.55722: ANSIBALLZ: Renaming module 12372 1727204073.55755: ANSIBALLZ: Done 
creating module 12372 1727204073.55800: variable 'ansible_facts' from source: unknown 12372 1727204073.55813: variable 'inventory_hostname' from source: host vars for 'managed-node3' 12372 1727204073.55828: _low_level_execute_command(): starting 12372 1727204073.55840: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 12372 1727204073.56471: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12372 1727204073.56486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12372 1727204073.56504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12372 1727204073.56524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12372 1727204073.56542: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 12372 1727204073.56554: stderr chunk (state=3): >>>debug2: match not found <<< 12372 1727204073.56568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12372 1727204073.56585: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12372 1727204073.56601: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 12372 1727204073.56695: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12372 1727204073.56725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12372 1727204073.56799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12372 1727204073.58624: stdout chunk (state=3): >>>PLATFORM Linux <<< 12372 1727204073.58658: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 12372 1727204073.59162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12372 1727204073.59166: stdout chunk (state=3): >>><<< 12372 1727204073.59168: stderr chunk (state=3): >>><<< 12372 1727204073.59171: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12372 1727204073.59177 [managed-node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 12372 1727204073.59183: _low_level_execute_command(): starting 12372 1727204073.59186: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 12372 1727204073.59617: Sending initial data 12372 1727204073.59623: Sent initial data (1181 bytes) 12372 1727204073.60288: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12372 1727204073.60606: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12372 1727204073.60619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12372 1727204073.60741: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12372 1727204073.64453: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} <<< 12372 1727204073.64797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12372 1727204073.64885: stderr chunk (state=3): >>><<< 12372 1727204073.65006: stdout chunk (state=3): >>><<< 12372 1727204073.65036: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty 
Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12372 1727204073.65197: variable 'ansible_facts' from source: unknown 12372 1727204073.65205: variable 'ansible_facts' from source: unknown 12372 1727204073.65208: variable 'ansible_module_compression' from source: unknown 12372 1727204073.65403: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12372u51ts529/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12372 1727204073.65436: variable 'ansible_facts' from source: unknown 12372 1727204073.66033: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204073.0815492-12672-26799053669801/AnsiballZ_setup.py 12372 1727204073.66597: Sending initial data 12372 1727204073.66601: Sent initial data (153 bytes) 12372 1727204073.67118: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12372 1727204073.67335: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12372 1727204073.67351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12372 1727204073.67441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 12372 1727204073.67548: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12372 1727204073.67676: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12372 1727204073.67749: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12372 1727204073.69685: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 12372 1727204073.69775: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12372 1727204073.69805: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12372 1727204073.69839: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12372u51ts529/tmpkga905d_ /root/.ansible/tmp/ansible-tmp-1727204073.0815492-12672-26799053669801/AnsiballZ_setup.py <<< 12372 1727204073.69851: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204073.0815492-12672-26799053669801/AnsiballZ_setup.py" <<< 12372 1727204073.69885: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12372u51ts529/tmpkga905d_" to remote "/root/.ansible/tmp/ansible-tmp-1727204073.0815492-12672-26799053669801/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204073.0815492-12672-26799053669801/AnsiballZ_setup.py" <<< 12372 1727204073.74473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12372 1727204073.74896: stderr chunk (state=3): >>><<< 12372 1727204073.74900: stdout chunk (state=3): >>><<< 12372 1727204073.74902: done transferring module to remote 12372 1727204073.74904: _low_level_execute_command(): starting 12372 1727204073.74907: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204073.0815492-12672-26799053669801/ /root/.ansible/tmp/ansible-tmp-1727204073.0815492-12672-26799053669801/AnsiballZ_setup.py && sleep 0' 12372 1727204073.76229: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12372 1727204073.76348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12372 1727204073.76363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 12372 1727204073.76380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12372 1727204073.76630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12372 1727204073.76715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12372 1727204073.85982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12372 1727204073.86231: stderr chunk (state=3): >>><<< 12372 1727204073.86236: stdout chunk (state=3): >>><<< 12372 1727204073.86239: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12372 1727204073.86242: _low_level_execute_command(): starting 12372 1727204073.86245: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204073.0815492-12672-26799053669801/AnsiballZ_setup.py && sleep 0' 12372 1727204073.87355: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12372 1727204073.87405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 12372 1727204073.87674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12372 1727204073.87701: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12372 1727204073.87780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12372 1727204073.90063: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 12372 1727204073.90096: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 12372 1727204073.90163: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 12372 1727204073.90205: stdout chunk (state=3): >>>import 'posix' # <<< 12372 1727204073.90247: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 12372 1727204073.90301: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 12372 1727204073.90455: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 12372 1727204073.90459: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10772c4d0> <<< 12372 1727204073.90494: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1076fbad0> <<< 12372 1727204073.90564: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10772ea20> import '_signal' # import '_abc' # import 'abc' # <<< 12372 1727204073.90599: stdout chunk (state=3): >>>import 'io' # <<< 12372 1727204073.90632: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 12372 1727204073.90963: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1074fd0a0> <<< 12372 1727204073.91022: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1074fdfd0> <<< 12372 1727204073.91071: stdout chunk (state=3): >>>import 'site' # <<< 12372 1727204073.91106: stdout chunk (state=3): >>>Python 3.12.6 
(main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 12372 1727204073.91756: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 12372 1727204073.91805: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 12372 1727204073.91809: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 12372 1727204073.91899: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 12372 1727204073.91942: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 12372 1727204073.91948: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 12372 1727204073.92003: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10753be00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 12372 1727204073.92007: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 12372 1727204073.92066: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10753bec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 12372 1727204073.92098: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 12372 1727204073.92150: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 12372 1727204073.92202: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 12372 1727204073.92251: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107573800> <<< 12372 1727204073.92299: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107573e90> <<< 12372 1727204073.92404: stdout chunk (state=3): >>>import '_collections' # <<< 12372 1727204073.92456: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107553ad0> <<< 12372 1727204073.92494: stdout chunk (state=3): >>>import '_functools' # <<< 12372 1727204073.92550: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075511f0><<< 12372 1727204073.92646: stdout chunk (state=3): >>> <<< 12372 1727204073.92736: stdout chunk (state=3): >>>import 
'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107538fb0> <<< 12372 1727204073.92783: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 12372 1727204073.92823: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 12372 1727204073.92860: stdout chunk (state=3): >>>import '_sre' # <<< 12372 1727204073.92905: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py<<< 12372 1727204073.92947: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc'<<< 12372 1727204073.92962: stdout chunk (state=3): >>> <<< 12372 1727204073.92996: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 12372 1727204073.93062: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107597710><<< 12372 1727204073.93075: stdout chunk (state=3): >>> <<< 12372 1727204073.93113: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107596330><<< 12372 1727204073.93148: stdout chunk (state=3): >>> <<< 12372 1727204073.93151: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py<<< 12372 1727204073.93169: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075521e0><<< 12372 1727204073.93257: stdout chunk (state=3): >>> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10753aea0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py<<< 12372 1727204073.93296: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc'<<< 12372 1727204073.93321: stdout chunk (state=3): >>> import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075c8740> <<< 12372 1727204073.93347: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107538230> <<< 12372 1727204073.93368: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 12372 1727204073.93444: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204073.93467: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1075c8bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075c8aa0><<< 12372 1727204073.93525: stdout chunk (state=3): >>> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204073.93562: stdout chunk (state=3): >>> # extension module 
'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1075c8e90> <<< 12372 1727204073.93736: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107536d50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 12372 1727204073.93801: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075c9550> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075c9220> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075ca450> import 'importlib.util' # import 'runpy' # <<< 12372 1727204073.93964: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075e4680> <<< 12372 1727204073.94053: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1075e5dc0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075e6cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1075e7320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075e6210> <<< 12372 1727204073.94059: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 12372 1727204073.94062: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 12372 1727204073.94174: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension 
module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1075e7da0> <<< 12372 1727204073.94240: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075e74d0> <<< 12372 1727204073.94247: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075ca4b0> <<< 12372 1727204073.94328: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 12372 1727204073.94335: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 12372 1727204073.94338: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 12372 1727204073.94344: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 12372 1727204073.94352: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204073.94354: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1072dbd10> <<< 12372 1727204073.94357: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 12372 1727204073.94400: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107304710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107304470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107304740> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107304920> <<< 12372 1727204073.94424: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1072d9eb0> <<< 12372 1727204073.94441: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 12372 1727204073.94631: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fe107305f70> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107304bf0> <<< 12372 1727204073.94653: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075caba0> <<< 12372 1727204073.94667: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 12372 1727204073.94733: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 12372 1727204073.94994: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 12372 1727204073.94998: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1073322d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10734a3f0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 12372 1727204073.95030: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 12372 1727204073.95131: stdout chunk (state=3): >>>import 'ntpath' # <<< 12372 1727204073.95134: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1073871d0> <<< 12372 1727204073.95151: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 12372 1727204073.95180: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 12372 1727204073.95239: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 12372 1727204073.95270: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 12372 1727204073.95340: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1073a9970> <<< 12372 1727204073.95421: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1073872f0> <<< 12372 1727204073.95466: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10734b080> <<< 12372 1727204073.95517: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1071c82c0> <<< 12372 
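The chunks from "import _frozen_importlib # frozen" onward are the remote Python 3.12 interpreter tracing its own imports, produced because the command shown earlier sets PYTHONVERBOSE=1: for every module it reports either a frozen/builtin import or that a cached bytecode file under __pycache__ "matches" its source. That source-to-cache mapping can be reproduced with importlib.util; the base64.py path used below is one that appears in the trace.

    # The "__pycache__/<name>.cpython-312.pyc matches <name>.py" lines are the
    # interpreter (under PYTHONVERBOSE=1) reporting bytecode-cache hits. The same
    # mapping is exposed by importlib.util:
    import importlib.util

    src = "/usr/lib64/python3.12/base64.py"
    print(importlib.util.cache_from_source(src))
    # on CPython 3.12 this prints
    # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc

    # and the reverse lookup:
    cached = "/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc"
    print(importlib.util.source_from_cache(cached))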
1727204073.95521: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107349430> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107306e70> <<< 12372 1727204073.95948: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fe107349550> <<< 12372 1727204073.96045: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_33zdobdq/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 12372 1727204073.96271: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204073.96317: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 12372 1727204073.96341: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 12372 1727204073.96379: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 12372 1727204073.96493: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 12372 1727204073.96543: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10722e030> <<< 12372 1727204073.96556: stdout chunk (state=3): >>>import '_typing' # <<< 12372 1727204073.96868: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107204f20> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1071cbfb0> # zipimport: zlib available <<< 12372 1727204073.96907: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 12372 1727204073.96938: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12372 1727204073.96975: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 12372 1727204073.99399: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.00879: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107207ec0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107261af0> <<< 12372 1727204074.00911: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107261880> <<< 12372 1727204074.00944: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107261190> <<< 12372 1727204074.00978: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 12372 1727204074.01025: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1072615e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10722ecc0> import 'atexit' # <<< 12372 1727204074.01059: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1072628a0> <<< 12372 1727204074.01117: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107262ae0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 12372 1727204074.01258: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 12372 1727204074.01262: stdout chunk (state=3): >>>import '_locale' # <<< 12372 1727204074.01265: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107262fc0> import 'pwd' # <<< 12372 1727204074.01268: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 12372 1727204074.01360: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 12372 1727204074.01396: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1070c4e00> <<< 12372 1727204074.01399: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1070c6a20> <<< 12372 1727204074.01460: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 12372 1727204074.01464: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1070c73b0> <<< 12372 1727204074.01525: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 12372 
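The "# zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_33zdobdq/ansible_ansible.legacy.setup_payload.zip'" line a little earlier is the key to the rest of this trace: the setup (fact gathering) module was shipped as a single AnsiballZ wrapper script carrying a zip archive of ansible.module_utils plus the module code, and the many "# zipimport: zlib available" lines that follow are the zip importer confirming it can decompress entries as more of that payload is pulled in. The underlying mechanism is ordinary zipimport, i.e. a zip file placed on sys.path; the sketch below demonstrates that mechanism with an invented package name and is not the actual AnsiballZ wrapper.

    # Minimal sketch of importing a package straight out of a zip archive, the
    # mechanism behind the "# zipimport: found 103 names in ..." line above.
    # The package and function names are invented for this example.
    import os
    import sys
    import tempfile
    import zipfile

    def build_payload(zip_path: str) -> None:
        """Write a tiny importable package into a zip archive."""
        with zipfile.ZipFile(zip_path, "w", compression=zipfile.ZIP_DEFLATED) as zf:
            zf.writestr("mypayload/__init__.py", "")
            zf.writestr("mypayload/hello.py",
                        "def greet():\n    return 'hello from the zip'\n")

    if __name__ == "__main__":
        zip_path = os.path.join(tempfile.mkdtemp(), "payload.zip")
        build_payload(zip_path)
        sys.path.insert(0, zip_path)    # zipimport handles this sys.path entry
        from mypayload import hello     # resolved from inside the archive
        print(hello.greet())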
1727204074.01541: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1070c8590> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 12372 1727204074.01659: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 12372 1727204074.01662: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1070cb020> <<< 12372 1727204074.01798: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1070cb170> <<< 12372 1727204074.01806: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1070c91f0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 12372 1727204074.01929: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 12372 1727204074.01933: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 12372 1727204074.01979: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1070ceed0> import '_tokenize' # <<< 12372 1727204074.02295: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1070cd9a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1070cd700> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 12372 1727204074.02330: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1070cff50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1070c97f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107112f90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107113140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107118d10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107118ad0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 12372 1727204074.02439: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 12372 1727204074.02500: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204074.02533: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe10711b2c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107119400> <<< 12372 1727204074.02546: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 12372 1727204074.02575: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 12372 1727204074.02748: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 12372 1727204074.02763: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107122ae0> <<< 12372 1727204074.02841: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10711b470> <<< 12372 1727204074.03099: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107123dd0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107123bc0> # extension module 'systemd.id128' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107123f20> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107113440> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 12372 1727204074.03135: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 12372 1727204074.03158: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204074.03187: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107126b10> <<< 12372 1727204074.03382: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107127e90> <<< 12372 1727204074.03486: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1071252b0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107126660> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107124e60> <<< 12372 1727204074.03607: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.03612: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 12372 1727204074.03894: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # <<< 12372 1727204074.03898: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 12372 1727204074.03951: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.04163: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.04798: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.05516: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 12372 1727204074.05563: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/__init__.py <<< 12372 1727204074.05596: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 12372 1727204074.05640: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe106fb0080> <<< 12372 1727204074.05753: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 12372 1727204074.05815: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106fb10a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10712b530> <<< 12372 1727204074.05904: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 12372 1727204074.05930: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 12372 1727204074.06105: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.06336: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 12372 1727204074.06360: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106fb1220> # zipimport: zlib available <<< 12372 1727204074.06988: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.07613: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12372 1727204074.07671: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 12372 1727204074.07694: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.07723: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.07778: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 12372 1727204074.08041: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 12372 1727204074.08075: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.08124: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 12372 1727204074.08149: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.08417: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.08712: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 12372 1727204074.08803: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 12372 1727204074.08817: stdout chunk (state=3): >>>import '_ast' # <<< 12372 1727204074.08898: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106fb3560> <<< 12372 1727204074.08935: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 12372 1727204074.08997: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.09140: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 12372 1727204074.09148: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 12372 1727204074.09245: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204074.09356: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe106fb9c40> <<< 12372 1727204074.09492: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe106fba5a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106fb2690> # zipimport: zlib available <<< 12372 1727204074.09599: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.09603: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 12372 1727204074.09614: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12372 1727204074.09647: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.09714: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.09792: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 12372 1727204074.09832: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 12372 1727204074.09932: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe106fb9400> <<< 12372 1727204074.09973: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106fba840> <<< 12372 1727204074.10025: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 12372 1727204074.10234: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 12372 1727204074.10251: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 12372 1727204074.10294: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 12372 1727204074.10318: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 12372 1727204074.10377: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 12372 1727204074.10410: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 12372 1727204074.10647: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107052b10> <<< 12372 1727204074.10650: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106fc47d0> <<< 12372 1727204074.10680: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106fc3c50> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106fc26f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available <<< 12372 1727204074.10700: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 12372 1727204074.10790: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 12372 1727204074.10914: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 12372 1727204074.10944: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12372 1727204074.10971: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.11020: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.11032: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.11331: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.11335: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.11495: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 12372 1727204074.11597: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.11911: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 12372 1727204074.11956: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # 
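By this point the payload has pulled in ansible.module_utils.basic together with its argument-handling helpers (common.parameters, common.validation, common.arg_spec) and the bundled distro and selinux bindings; basic provides AnsibleModule, the entry point that setup and other Python modules are built on. As a reference point, a minimal custom module built on that same entry point looks roughly like the sketch below; the module name, option and message are invented for the example. For local testing such a module can typically be exercised by passing it a JSON args file containing {"ANSIBLE_MODULE_ARGS": {"name": "world"}} as its first argument.

    # hello_module.py - minimal illustration of a module built on AnsibleModule,
    # the class provided by the ansible.module_utils.basic import seen above.
    # The argument name and returned message are made up for this sketch.
    from ansible.module_utils.basic import AnsibleModule

    def main():
        module = AnsibleModule(
            argument_spec=dict(
                name=dict(type="str", required=True),
            ),
            supports_check_mode=True,
        )
        # A real module would do its work here; this one only reports back.
        module.exit_json(changed=False, msg="Hello, %s" % module.params["name"])

    if __name__ == "__main__":
        main()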
/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 12372 1727204074.11981: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 12372 1727204074.12024: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107059100> <<< 12372 1727204074.12060: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 12372 1727204074.12077: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 12372 1727204074.12132: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 12372 1727204074.12365: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10654c140> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe10654c440> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106fa51f0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106fa47a0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10705b3b0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10705aab0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 12372 1727204074.12411: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 12372 1727204074.12435: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 12372 1727204074.12681: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe10654f380> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10654ec30> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7fe10654ee10> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10654e090> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 12372 1727204074.12707: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10654f4d0> <<< 12372 1727204074.12727: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 12372 1727204074.13128: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1065ba000> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10654ffe0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10705a840> <<< 12372 1727204074.13132: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 12372 1727204074.13152: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available <<< 12372 1727204074.13184: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 12372 1727204074.13231: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 12372 1727204074.13260: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.13277: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.13309: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 12372 1727204074.13407: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.13428: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 12372 1727204074.13475: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.13584: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 12372 1727204074.13587: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.13640: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.13683: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.13718: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.13798: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 12372 1727204074.13900: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.14359: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.14871: stdout chunk (state=3): 
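The multiprocessing, queue and pool imports just above are followed immediately by ansible.module_utils.facts.timeout and facts.collector, which suggests the familiar pattern of running each fact collector in a worker so a slow or hung probe can be abandoned rather than stalling the whole setup run. The sketch below shows only that generic timeout pattern; the function name and the 10-second limit are assumptions, and this is not Ansible's actual implementation.

    # Generic per-collector timeout pattern (illustration, not Ansible's code):
    # run one fact-gathering callable in a worker thread and give up if it
    # exceeds a deadline instead of blocking the whole run.
    import multiprocessing.pool
    import time

    GATHER_TIMEOUT = 10  # seconds; hypothetical limit for this sketch

    def slow_fact() -> dict:
        time.sleep(1)  # stand-in for reading /proc, running a command, etc.
        return {"example_fact": 42}

    def collect_with_timeout(func, timeout: int = GATHER_TIMEOUT):
        pool = multiprocessing.pool.ThreadPool(processes=1)
        try:
            return pool.apply_async(func).get(timeout=timeout)
        except multiprocessing.TimeoutError:
            return {}  # drop this fact rather than hang
        finally:
            pool.terminate()

    if __name__ == "__main__":
        print(collect_with_timeout(slow_fact))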
>>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 12372 1727204074.14920: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.14985: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.15026: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.15207: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available <<< 12372 1727204074.15254: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available <<< 12372 1727204074.15280: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available <<< 12372 1727204074.15320: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.15357: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 12372 1727204074.15437: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.15454: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 12372 1727204074.15533: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.15666: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 12372 1727204074.15695: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1065bb3b0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 12372 1727204074.15708: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 12372 1727204074.15903: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1065bac60> import 'ansible.module_utils.facts.system.local' # <<< 12372 1727204074.15925: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12372 1727204074.15975: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 12372 1727204074.15999: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.16087: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.16186: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 12372 1727204074.16208: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.16354: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.16391: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 12372 1727204074.16410: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.16471: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 12372 1727204074.16586: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 12372 1727204074.16610: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204074.16647: stdout chunk (state=3): 
>>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1065e62a0> <<< 12372 1727204074.16856: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1065d3140> import 'ansible.module_utils.facts.system.python' # <<< 12372 1727204074.16922: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.16940: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.17224: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 12372 1727204074.17305: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.17465: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 12372 1727204074.17486: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 12372 1727204074.17518: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.17564: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 12372 1727204074.17611: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.17670: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 12372 1727204074.17711: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204074.17743: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe106401df0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106401a60> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 12372 1727204074.17813: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 12372 1727204074.17829: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.17881: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 12372 1727204074.17987: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.18038: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.18222: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 12372 1727204074.18333: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.18441: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.18480: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.18536: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 12372 1727204074.18593: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12372 1727204074.18597: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.18752: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 12372 1727204074.18914: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 12372 1727204074.18935: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.19058: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.19210: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 12372 1727204074.19319: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.19340: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12372 1727204074.19918: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.20481: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 12372 1727204074.20499: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.20722: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 12372 1727204074.20860: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.20966: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 12372 1727204074.21120: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.21380: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 12372 1727204074.21408: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.21437: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 12372 1727204074.21545: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.21653: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.21878: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.22110: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 12372 1727204074.22206: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 12372 1727204074.22262: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 12372 1727204074.22350: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.22422: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 12372 1727204074.22447: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.22460: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.22506: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 12372 1727204074.22748: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.22752: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 12372 1727204074.22755: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 12372 1727204074.22757: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 12372 1727204074.23143: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.23368: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 12372 1727204074.23486: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.23555: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 12372 1727204074.23559: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.23618: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available <<< 12372 1727204074.23649: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 12372 1727204074.23924: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.24006: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available <<< 12372 1727204074.24053: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 12372 1727204074.24068: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.24091: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.24112: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.24381: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.24407: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 12372 1727204074.24437: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.24491: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 12372 1727204074.24597: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.24896: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.24937: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 12372 1727204074.24946: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.25111: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 12372 1727204074.25156: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 12372 1727204074.25167: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.25255: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.25350: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 12372 1727204074.25371: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.25461: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.25558: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # 
import 'ansible.module_utils.facts' # <<< 12372 1727204074.25870: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204074.26903: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 12372 1727204074.26921: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 12372 1727204074.26925: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 12372 1727204074.26928: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 12372 1727204074.27100: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe10642bda0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106428650> <<< 12372 1727204074.27104: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106429730> <<< 12372 1727204074.44875: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 12372 1727204074.44909: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 12372 1727204074.44943: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106470b60> <<< 12372 1727204074.44981: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 12372 1727204074.45019: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 12372 1727204074.45023: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106471af0> <<< 12372 1727204074.45097: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 12372 1727204074.45130: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 12372 1727204074.45251: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1064c41d0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106473890> <<< 12372 1727204074.45563: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame 
PyThreadState_Clear: warning: thread still has a frame <<< 12372 1727204074.66185: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": <<< 12372 1727204074.66233: stdout chunk (state=3): >>>"", "ansible_loadavg": {"1m": 0.4619140625, "5m": 0.39013671875, "15m": 0.23046875}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "34", "epoch": "1727204074", "epoch_int": "1727204074", "date": "2024-09-24", "time": "14:54:34", "iso8601_micro": "2024-09-24T18:54:34.272160Z", "iso8601": "2024-09-24T18:54:34Z", "iso8601_basic": "20240924T145434272160", "iso8601_basic_short": "20240924T145434", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], 
"ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2871, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 846, "free": 2871}, "nocache": {"free": 3479, "used": 238}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272<<< 12372 1727204074.66254: stdout chunk (state=3): >>>ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 578, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251164712960, "block_size": 4096, "block_total": 64479564, "block_available": 61319510, "block_used": 3160054, "inode_total": 16384000, "inode_available": 16302333, "inode_used": 81667, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12372 1727204074.66879: stdout chunk (state=3): >>># clear 
sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 <<< 12372 1727204074.67241: stdout chunk (state=3): >>># clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] 
removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing 
ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ <<< 12372 1727204074.67479: stdout chunk (state=3): >>># destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing 
ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cl<<< 12372 1727204074.67503: stdout chunk (state=3): >>>eanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # 
destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 12372 1727204074.67876: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 12372 1727204074.68097: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy 
_compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale <<< 12372 1727204074.68129: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 12372 1727204074.68194: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 <<< 12372 1727204074.68226: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 12372 1727204074.68266: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 12372 1727204074.68365: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues <<< 12372 1727204074.68465: stdout chunk (state=3): >>># destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle <<< 12372 1727204074.68469: stdout chunk (state=3): >>># destroy _pickle <<< 12372 1727204074.68611: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct <<< 12372 1727204074.68699: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout <<< 12372 1727204074.68703: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection <<< 12372 1727204074.68740: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 12372 1727204074.69007: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes <<< 12372 1727204074.69011: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 12372 1727204074.69014: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping 
contextlib <<< 12372 1727204074.69047: stdout chunk (state=3): >>># cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 12372 1727204074.69239: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 12372 1727204074.69273: stdout chunk (state=3): >>># destroy _collections <<< 12372 1727204074.69605: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 12372 1727204074.69614: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 12372 1727204074.69639: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io <<< 12372 1727204074.69741: stdout chunk (state=3): >>># destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 12372 1727204074.69772: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 <<< 12372 1727204074.69783: stdout chunk (state=3): >>># destroy _sre <<< 12372 1727204074.69827: stdout chunk (state=3): >>># destroy _string # destroy re <<< 12372 
1727204074.69832: stdout chunk (state=3): >>># destroy itertools <<< 12372 1727204074.69885: stdout chunk (state=3): >>># destroy _abc <<< 12372 1727204074.69909: stdout chunk (state=3): >>># destroy posix <<< 12372 1727204074.69913: stdout chunk (state=3): >>># destroy _functools # destroy builtins # destroy _thread <<< 12372 1727204074.69919: stdout chunk (state=3): >>># clear sys.audit hooks <<< 12372 1727204074.70578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 12372 1727204074.70582: stdout chunk (state=3): >>><<< 12372 1727204074.70588: stderr chunk (state=3): >>><<< 12372 1727204074.70831: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10772c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1076fbad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10772ea20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1074fd0a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1074fdfd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
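
For reference: the long single-line JSON object a few chunks above is the setup module's result, i.e. everything under "ansible_facts" plus an "invocation" block recording the module_args (gather_subset, gather_timeout, filter, fact_path), and the `_low_level_execute_command() done: rc=0, stdout=` record here replays the remote run's full stdout. A minimal sketch for pulling that result object out of a saved copy of a capture like this one and reading a few of the fields shown above; the file name is hypothetical, and the marker regex assumes the `<<< pid timestamp: stdout chunk (state=N): >>>` shape seen in this log, since those markers can split the JSON mid-value.

    import json
    import re

    # Hypothetical path to a saved copy of a capture like this one.
    LOG_PATH = "ansible-debug.log"

    # Strip the '<<< pid ts: stdout chunk (state=N): >>>' markers, which can
    # interrupt the JSON payload mid-value in this kind of capture.
    CHUNK_MARKER = re.compile(r"<<<\s*\d+\s+[\d.]+:\s+stdout chunk \(state=\d+\):\s+>>>")

    with open(LOG_PATH, encoding="utf-8", errors="replace") as fh:
        text = CHUNK_MARKER.sub("", fh.read())

    # Parse the first complete JSON object starting at the ansible_facts marker.
    start = text.index('{"ansible_facts"')
    result, _ = json.JSONDecoder().raw_decode(text, start)

    facts = result["ansible_facts"]
    print(facts["ansible_distribution"], facts["ansible_distribution_version"])
    print(facts["ansible_default_ipv4"]["address"])
    print(result["invocation"]["module_args"]["gather_subset"])
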
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10753be00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10753bec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107573800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107573e90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107553ad0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075511f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107538fb0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107597710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107596330> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075521e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10753aea0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075c8740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107538230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1075c8bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075c8aa0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1075c8e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107536d50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075c9550> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075c9220> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075ca450> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075e4680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1075e5dc0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fe1075e6cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1075e7320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075e6210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1075e7da0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075e74d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075ca4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1072dbd10> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107304710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107304470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107304740> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107304920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1072d9eb0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107305f70> 
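
The replayed stdout around this point is the managed node's interpreter loading the zipped module payload (ansible_ansible.legacy.setup_payload.zip, pulled in via zipimport below) with verbose import tracing enabled; "PYTHONVERBOSE": "1" appears in ansible_env in the facts above, so every `import 'name' # <loader ...>` line records a module load. A minimal sketch that tallies those lines from a saved capture; the file name is hypothetical, and only the quoted `import '...' #` form shown here is counted (the unquoted frozen/builtin lines are ignored).

    import re
    from collections import Counter

    # Hypothetical path to a saved copy of a capture like this one.
    LOG_PATH = "ansible-debug.log"

    # Matches verbose-import lines of the form: import 'json.decoder' # <loader ...>
    IMPORT_RE = re.compile(r"import '([\w.]+)' #")

    with open(LOG_PATH, encoding="utf-8", errors="replace") as fh:
        modules = IMPORT_RE.findall(fh.read())

    # Count imports by top-level package to see what dominates the trace.
    top_level = Counter(name.split(".")[0] for name in modules)

    print(f"{len(set(modules))} distinct modules imported")
    for pkg, count in top_level.most_common(10):
        print(f"{pkg}: {count}")
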
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107304bf0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1075caba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1073322d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10734a3f0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1073871d0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1073a9970> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1073872f0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10734b080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1071c82c0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107349430> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107306e70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fe107349550> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_33zdobdq/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10722e030> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107204f20> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1071cbfb0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107207ec0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107261af0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107261880> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107261190> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1072615e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10722ecc0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1072628a0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107262ae0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107262fc0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1070c4e00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1070c6a20> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1070c73b0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1070c8590> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1070cb020> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1070cb170> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1070c91f0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1070ceed0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1070cd9a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1070cd700> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1070cff50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1070c97f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107112f90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107113140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107118d10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107118ad0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe10711b2c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107119400> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107122ae0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10711b470> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107123dd0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107123bc0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107123f20> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107113440> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107126b10> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107127e90> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1071252b0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe107126660> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107124e60> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe106fb0080> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106fb10a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10712b530> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
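(Editor's note, not part of the captured output: the repeated "# zipimport: zlib available" markers around the ansible.module_utils.* imports indicate those packages are being loaded straight from the AnsiballZ payload zip named earlier in the trace (/tmp/ansible_ansible.legacy.setup_payload_.../ansible_ansible.legacy.setup_payload.zip), not from files on disk. A minimal sketch of inspecting such a payload with the standard library; the path below is an example only, since the real temp directory name changes on every run.)

    # Hypothetical inspection of an AnsiballZ payload zip; replace the path with
    # the per-run temp path reported by the trace ("zipimport: found N names in ...").
    import zipfile

    payload = "/tmp/ansible_ansible.legacy.setup_payload_XXXX/ansible_ansible.legacy.setup_payload.zip"
    with zipfile.ZipFile(payload) as zf:
        # List the embedded sources that zipimport serves to the module at runtime.
        for name in zf.namelist():
            print(name)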
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106fb1220> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106fb3560> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe106fb9c40> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe106fba5a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106fb2690> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe106fb9400> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106fba840> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107052b10> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106fc47d0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106fc3c50> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106fc26f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe107059100> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe10654c140> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe10654c440> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106fa51f0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106fa47a0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10705b3b0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10705aab0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe10654f380> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10654ec30> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe10654ee10> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10654e090> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10654f4d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1065ba000> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10654ffe0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe10705a840> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1065bb3b0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1065bac60> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1065e62a0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1065d3140> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe106401df0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106401a60> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe10642bda0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106428650> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106429730> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106470b60> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106471af0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1064c41d0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe106473890> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, 
"ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "", "ansible_loadavg": {"1m": 0.4619140625, "5m": 0.39013671875, "15m": 0.23046875}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "34", "epoch": "1727204074", "epoch_int": "1727204074", "date": "2024-09-24", "time": "14:54:34", "iso8601_micro": "2024-09-24T18:54:34.272160Z", "iso8601": "2024-09-24T18:54:34Z", "iso8601_basic": "20240924T145434272160", "iso8601_basic_short": "20240924T145434", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", 
"mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2871, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 846, "free": 2871}, "nocache": {"free": 3479, "used": 238}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 578, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": 
"/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251164712960, "block_size": 4096, "block_total": 64479564, "block_available": 61319510, "block_used": 3160054, "inode_total": 16384000, "inode_available": 16302333, "inode_used": 81667, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing 
random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] 
removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # 
cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] 
removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing 
multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # 
cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks [...] # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed-node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 
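The interpreter-discovery warning above can be avoided by pinning the interpreter for the host instead of relying on discovery. The following is a minimal sketch, not part of this run's output: the file name is hypothetical, the host name and /usr/bin/python3.12 are taken from the warning itself, and any inventory or host_vars source that sets the variable works the same way.

    # host_vars/managed-node3.yml  (hypothetical file name)
    # Pin the interpreter so ansible-core skips discovery and stops emitting the warning above.
    ansible_python_interpreter: /usr/bin/python3.12

When every managed host ships the same interpreter path, the equivalent global setting is interpreter_python under [defaults] in ansible.cfg.
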
12372 1727204074.73008: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204073.0815492-12672-26799053669801/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12372 1727204074.73012: _low_level_execute_command(): starting 12372 1727204074.73014: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204073.0815492-12672-26799053669801/ > /dev/null 2>&1 && sleep 0' 12372 1727204074.73283: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12372 1727204074.73303: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12372 1727204074.73335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12372 1727204074.73454: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 12372 1727204074.73468: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12372 1727204074.73484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12372 1727204074.73577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12372 1727204074.76531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12372 1727204074.76543: stdout chunk (state=3): >>><<< 12372 1727204074.76560: stderr chunk (state=3): >>><<< 12372 1727204074.76796: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12372 1727204074.76800: handler run complete 12372 1727204074.76996: variable 'ansible_facts' from source: unknown 12372 1727204074.77351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204074.77876: variable 'ansible_facts' from source: unknown 12372 1727204074.78012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204074.79580: attempt loop complete, returning result 12372 1727204074.79598: _execute() done 12372 1727204074.79606: dumping result to json 12372 1727204074.79651: done dumping result, returning 12372 1727204074.79667: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [12b410aa-8751-244a-02f9-0000000001bc] 12372 1727204074.79696: sending task result for task 12b410aa-8751-244a-02f9-0000000001bc ok: [managed-node3] 12372 1727204074.80872: done sending task result for task 12b410aa-8751-244a-02f9-0000000001bc 12372 1727204074.81265: WORKER PROCESS EXITING 12372 1727204074.81258: no more pending results, returning what we have 12372 1727204074.81278: results queue empty 12372 1727204074.81279: checking for any_errors_fatal 12372 1727204074.81281: done checking for any_errors_fatal 12372 1727204074.81282: checking for max_fail_percentage 12372 1727204074.81283: done checking for max_fail_percentage 12372 1727204074.81284: checking to see if all hosts have failed and the running result is not ok 12372 1727204074.81285: done checking to see if all hosts have failed 12372 1727204074.81286: getting the remaining hosts for this loop 12372 1727204074.81288: done getting the remaining hosts for this loop 12372 1727204074.81296: getting the next task for host managed-node3 12372 1727204074.81303: done getting next task for host managed-node3 12372 1727204074.81305: ^ task is: TASK: meta (flush_handlers) 12372 1727204074.81308: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204074.81312: getting variables 12372 1727204074.81314: in VariableManager get_vars() 12372 1727204074.81495: Calling all_inventory to load vars for managed-node3 12372 1727204074.81501: Calling groups_inventory to load vars for managed-node3 12372 1727204074.81532: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204074.81545: Calling all_plugins_play to load vars for managed-node3 12372 1727204074.81550: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204074.81554: Calling groups_plugins_play to load vars for managed-node3 12372 1727204074.81861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204074.82181: done with get_vars() 12372 1727204074.82193: done getting variables 12372 1727204074.82256: in VariableManager get_vars() 12372 1727204074.82267: Calling all_inventory to load vars for managed-node3 12372 1727204074.82270: Calling groups_inventory to load vars for managed-node3 12372 1727204074.82274: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204074.82278: Calling all_plugins_play to load vars for managed-node3 12372 1727204074.82280: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204074.82282: Calling groups_plugins_play to load vars for managed-node3 12372 1727204074.82410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204074.82566: done with get_vars() 12372 1727204074.82580: done queuing things up, now waiting for results queue to drain 12372 1727204074.82582: results queue empty 12372 1727204074.82582: checking for any_errors_fatal 12372 1727204074.82584: done checking for any_errors_fatal 12372 1727204074.82585: checking for max_fail_percentage 12372 1727204074.82586: done checking for max_fail_percentage 12372 1727204074.82586: checking to see if all hosts have failed and the running result is not ok 12372 1727204074.82593: done checking to see if all hosts have failed 12372 1727204074.82594: getting the remaining hosts for this loop 12372 1727204074.82595: done getting the remaining hosts for this loop 12372 1727204074.82598: getting the next task for host managed-node3 12372 1727204074.82602: done getting next task for host managed-node3 12372 1727204074.82604: ^ task is: TASK: Include the task 'el_repo_setup.yml' 12372 1727204074.82606: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204074.82607: getting variables 12372 1727204074.82608: in VariableManager get_vars() 12372 1727204074.82615: Calling all_inventory to load vars for managed-node3 12372 1727204074.82618: Calling groups_inventory to load vars for managed-node3 12372 1727204074.82620: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204074.82624: Calling all_plugins_play to load vars for managed-node3 12372 1727204074.82627: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204074.82630: Calling groups_plugins_play to load vars for managed-node3 12372 1727204074.82742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204074.82894: done with get_vars() 12372 1727204074.82900: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_initscripts.yml:10 Tuesday 24 September 2024 14:54:34 -0400 (0:00:01.800) 0:00:01.815 ***** 12372 1727204074.82968: entering _queue_task() for managed-node3/include_tasks 12372 1727204074.82970: Creating lock for include_tasks 12372 1727204074.83230: worker is 1 (out of 1 available) 12372 1727204074.83246: exiting _queue_task() for managed-node3/include_tasks 12372 1727204074.83261: done queuing things up, now waiting for results queue to drain 12372 1727204074.83263: waiting for pending results... 12372 1727204074.83421: running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' 12372 1727204074.83487: in run() - task 12b410aa-8751-244a-02f9-000000000006 12372 1727204074.83533: variable 'ansible_search_path' from source: unknown 12372 1727204074.83550: calling self._execute() 12372 1727204074.83612: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204074.83620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204074.83642: variable 'omit' from source: magic vars 12372 1727204074.83894: _execute() done 12372 1727204074.83897: dumping result to json 12372 1727204074.83900: done dumping result, returning 12372 1727204074.83902: done running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' [12b410aa-8751-244a-02f9-000000000006] 12372 1727204074.83905: sending task result for task 12b410aa-8751-244a-02f9-000000000006 12372 1727204074.83992: done sending task result for task 12b410aa-8751-244a-02f9-000000000006 12372 1727204074.83996: WORKER PROCESS EXITING 12372 1727204074.84208: no more pending results, returning what we have 12372 1727204074.84213: in VariableManager get_vars() 12372 1727204074.84242: Calling all_inventory to load vars for managed-node3 12372 1727204074.84245: Calling groups_inventory to load vars for managed-node3 12372 1727204074.84249: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204074.84258: Calling all_plugins_play to load vars for managed-node3 12372 1727204074.84261: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204074.84264: Calling groups_plugins_play to load vars for managed-node3 12372 1727204074.85103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204074.85369: done with get_vars() 12372 1727204074.85378: variable 'ansible_search_path' from source: unknown 12372 1727204074.85394: we have included files to process 12372 
1727204074.85395: generating all_blocks data 12372 1727204074.85396: done generating all_blocks data 12372 1727204074.85397: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 12372 1727204074.85399: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 12372 1727204074.85402: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 12372 1727204074.86186: in VariableManager get_vars() 12372 1727204074.86207: done with get_vars() 12372 1727204074.86221: done processing included file 12372 1727204074.86224: iterating over new_blocks loaded from include file 12372 1727204074.86226: in VariableManager get_vars() 12372 1727204074.86237: done with get_vars() 12372 1727204074.86239: filtering new block on tags 12372 1727204074.86257: done filtering new block on tags 12372 1727204074.86260: in VariableManager get_vars() 12372 1727204074.86294: done with get_vars() 12372 1727204074.86297: filtering new block on tags 12372 1727204074.86318: done filtering new block on tags 12372 1727204074.86321: in VariableManager get_vars() 12372 1727204074.86333: done with get_vars() 12372 1727204074.86335: filtering new block on tags 12372 1727204074.86353: done filtering new block on tags 12372 1727204074.86355: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node3 12372 1727204074.86361: extending task lists for all hosts with included blocks 12372 1727204074.86423: done extending task lists 12372 1727204074.86425: done processing included files 12372 1727204074.86426: results queue empty 12372 1727204074.86427: checking for any_errors_fatal 12372 1727204074.86428: done checking for any_errors_fatal 12372 1727204074.86429: checking for max_fail_percentage 12372 1727204074.86431: done checking for max_fail_percentage 12372 1727204074.86432: checking to see if all hosts have failed and the running result is not ok 12372 1727204074.86433: done checking to see if all hosts have failed 12372 1727204074.86434: getting the remaining hosts for this loop 12372 1727204074.86435: done getting the remaining hosts for this loop 12372 1727204074.86438: getting the next task for host managed-node3 12372 1727204074.86443: done getting next task for host managed-node3 12372 1727204074.86445: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 12372 1727204074.86448: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204074.86450: getting variables 12372 1727204074.86451: in VariableManager get_vars() 12372 1727204074.86461: Calling all_inventory to load vars for managed-node3 12372 1727204074.86463: Calling groups_inventory to load vars for managed-node3 12372 1727204074.86466: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204074.86473: Calling all_plugins_play to load vars for managed-node3 12372 1727204074.86476: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204074.86479: Calling groups_plugins_play to load vars for managed-node3 12372 1727204074.86668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204074.86934: done with get_vars() 12372 1727204074.86944: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:54:34 -0400 (0:00:00.040) 0:00:01.855 ***** 12372 1727204074.87023: entering _queue_task() for managed-node3/setup 12372 1727204074.87708: worker is 1 (out of 1 available) 12372 1727204074.87723: exiting _queue_task() for managed-node3/setup 12372 1727204074.87737: done queuing things up, now waiting for results queue to drain 12372 1727204074.87739: waiting for pending results... 12372 1727204074.88229: running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 12372 1727204074.88543: in run() - task 12b410aa-8751-244a-02f9-0000000001cd 12372 1727204074.88547: variable 'ansible_search_path' from source: unknown 12372 1727204074.88550: variable 'ansible_search_path' from source: unknown 12372 1727204074.88598: calling self._execute() 12372 1727204074.88878: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204074.88887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204074.88903: variable 'omit' from source: magic vars 12372 1727204074.90170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204074.95911: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204074.96142: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204074.96334: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204074.96443: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204074.96459: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204074.96769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204074.96879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204074.96986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 12372 1727204074.97013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204074.97080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204074.97643: variable 'ansible_facts' from source: unknown 12372 1727204074.97834: variable 'network_test_required_facts' from source: task vars 12372 1727204074.97980: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 12372 1727204074.97993: variable 'omit' from source: magic vars 12372 1727204074.98145: variable 'omit' from source: magic vars 12372 1727204074.98191: variable 'omit' from source: magic vars 12372 1727204074.98281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12372 1727204074.98419: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12372 1727204074.98468: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12372 1727204074.98576: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12372 1727204074.98581: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12372 1727204074.98601: variable 'inventory_hostname' from source: host vars for 'managed-node3' 12372 1727204074.98641: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204074.98654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204074.99103: Set connection var ansible_connection to ssh 12372 1727204074.99106: Set connection var ansible_timeout to 10 12372 1727204074.99109: Set connection var ansible_module_compression to ZIP_DEFLATED 12372 1727204074.99112: Set connection var ansible_shell_executable to /bin/sh 12372 1727204074.99114: Set connection var ansible_shell_type to sh 12372 1727204074.99119: Set connection var ansible_pipelining to False 12372 1727204074.99121: variable 'ansible_shell_executable' from source: unknown 12372 1727204074.99124: variable 'ansible_connection' from source: unknown 12372 1727204074.99126: variable 'ansible_module_compression' from source: unknown 12372 1727204074.99128: variable 'ansible_shell_type' from source: unknown 12372 1727204074.99131: variable 'ansible_shell_executable' from source: unknown 12372 1727204074.99133: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204074.99135: variable 'ansible_pipelining' from source: unknown 12372 1727204074.99137: variable 'ansible_timeout' from source: unknown 12372 1727204074.99139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204074.99598: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12372 1727204074.99656: variable 'omit' from source: magic vars 12372 1727204074.99666: starting attempt loop 12372 
1727204074.99674: running the handler 12372 1727204074.99767: _low_level_execute_command(): starting 12372 1727204074.99781: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12372 1727204075.01541: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12372 1727204075.01642: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12372 1727204075.01732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12372 1727204075.01760: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12372 1727204075.01875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12372 1727204075.01954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12372 1727204075.03732: stdout chunk (state=3): >>>/root <<< 12372 1727204075.03838: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12372 1727204075.04082: stderr chunk (state=3): >>><<< 12372 1727204075.04086: stdout chunk (state=3): >>><<< 12372 1727204075.04088: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12372 1727204075.04100: _low_level_execute_command(): starting 12372 1727204075.04105: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204075.0402052-12794-272177309588317 `" && echo ansible-tmp-1727204075.0402052-12794-272177309588317="` echo /root/.ansible/tmp/ansible-tmp-1727204075.0402052-12794-272177309588317 `" ) && sleep 0' 12372 
1727204075.05667: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12372 1727204075.05702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12372 1727204075.05931: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12372 1727204075.05969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12372 1727204075.08031: stdout chunk (state=3): >>>ansible-tmp-1727204075.0402052-12794-272177309588317=/root/.ansible/tmp/ansible-tmp-1727204075.0402052-12794-272177309588317 <<< 12372 1727204075.08426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12372 1727204075.08430: stdout chunk (state=3): >>><<< 12372 1727204075.08433: stderr chunk (state=3): >>><<< 12372 1727204075.08594: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204075.0402052-12794-272177309588317=/root/.ansible/tmp/ansible-tmp-1727204075.0402052-12794-272177309588317 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12372 1727204075.08598: variable 'ansible_module_compression' from source: unknown 12372 1727204075.08601: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12372u51ts529/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12372 1727204075.08836: variable 'ansible_facts' from source: unknown 12372 1727204075.09368: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204075.0402052-12794-272177309588317/AnsiballZ_setup.py 12372 1727204075.09582: Sending initial data 12372 1727204075.09804: 
Sent initial data (154 bytes) 12372 1727204075.11047: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12372 1727204075.11507: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12372 1727204075.11721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12372 1727204075.12018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12372 1727204075.12035: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12372 1727204075.13671: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12372 1727204075.13704: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
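At this point the controller has already created the remote working directory /root/.ansible/tmp/ansible-tmp-1727204075.0402052-12794-272177309588317 and is uploading the locally built AnsiballZ_setup.py into it over the existing SSH ControlMaster connection; the SSH2_FXP_* extension negotiation above is the sftp subsystem starting that upload. A rough, hypothetical re-creation of those two steps with the stock OpenSSH client tools (made-up user and paths, the address is the one visible in the log; this is not Ansible's own code) might look like:

    import subprocess

    HOST = "root@10.31.10.90"                   # target address from the log; the user is an assumption
    REMOTE_TMP = "/root/.ansible/tmp/demo-tmp"  # stand-in for the ansible-tmp-* directory
    LOCAL_MODULE = "AnsiballZ_setup.py"         # stand-in for the locally built module payload

    # 1. create the remote temp directory with a restrictive umask, as the mkdir command in the log does
    subprocess.run(["ssh", HOST, f"umask 77 && mkdir -p {REMOTE_TMP}"], check=True)

    # 2. copy the module into it (Ansible drives sftp's "put"; scp is the simpler equivalent here)
    subprocess.run(["scp", LOCAL_MODULE, f"{HOST}:{REMOTE_TMP}/"], check=True)

Ansible drives the same OpenSSH binaries through its ssh connection plugin, which is why the surrounding stderr chunks are ordinary ssh/sftp debug output.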
<<< 12372 1727204075.13738: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12372u51ts529/tmp1c5ngcsv /root/.ansible/tmp/ansible-tmp-1727204075.0402052-12794-272177309588317/AnsiballZ_setup.py <<< 12372 1727204075.13754: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204075.0402052-12794-272177309588317/AnsiballZ_setup.py" <<< 12372 1727204075.13802: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 12372 1727204075.13823: stderr chunk (state=3): >>>debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12372u51ts529/tmp1c5ngcsv" to remote "/root/.ansible/tmp/ansible-tmp-1727204075.0402052-12794-272177309588317/AnsiballZ_setup.py" <<< 12372 1727204075.13902: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204075.0402052-12794-272177309588317/AnsiballZ_setup.py" <<< 12372 1727204075.18909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12372 1727204075.19077: stderr chunk (state=3): >>><<< 12372 1727204075.19081: stdout chunk (state=3): >>><<< 12372 1727204075.19084: done transferring module to remote 12372 1727204075.19086: _low_level_execute_command(): starting 12372 1727204075.19091: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204075.0402052-12794-272177309588317/ /root/.ansible/tmp/ansible-tmp-1727204075.0402052-12794-272177309588317/AnsiballZ_setup.py && sleep 0' 12372 1727204075.19867: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12372 1727204075.19885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12372 1727204075.19905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12372 1727204075.19931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12372 1727204075.19949: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 12372 1727204075.19997: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12372 1727204075.20127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12372 1727204075.20168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12372 1727204075.20211: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12372 1727204075.20538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12372 1727204075.22658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12372 1727204075.22787: stderr chunk (state=3): >>><<< 12372 1727204075.22804: stdout chunk (state=3): >>><<< 12372 1727204075.22868: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12372 1727204075.22878: _low_level_execute_command(): starting 12372 1727204075.22894: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204075.0402052-12794-272177309588317/AnsiballZ_setup.py && sleep 0' 12372 1727204075.23574: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12372 1727204075.23692: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 12372 1727204075.23707: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12372 1727204075.23725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12372 1727204075.23815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12372 1727204075.26560: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 12372 1727204075.26599: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 12372 1727204075.26708: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 12372 1727204075.26750: stdout chunk (state=3): >>>import 'posix' # <<< 12372 1727204075.26817: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 12372 1727204075.26841: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 12372 1727204075.26925: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 12372 1727204075.26950: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 12372 1727204075.26979: stdout chunk (state=3): >>>import 'codecs' # <<< 12372 1727204075.27067: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e2cc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e29bad0> <<< 12372 1727204075.27125: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e2cea20> <<< 12372 1727204075.27175: stdout chunk (state=3): >>>import '_signal' # import '_abc' # <<< 12372 1727204075.27202: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 12372 1727204075.27250: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 12372 1727204075.27496: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages<<< 12372 1727204075.27541: stdout chunk (state=3): >>> Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 12372 1727204075.27545: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 12372 1727204075.27579: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 12372 1727204075.27615: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e07d0a0> <<< 12372 1727204075.27719: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 12372 1727204075.27724: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e07dfd0> <<< 12372 1727204075.27786: stdout chunk (state=3): >>>import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
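Everything in this stretch of stdout, from the "import _frozen_importlib" entry onward, comes from CPython itself rather than from the setup module: the task was launched with PYTHONVERBOSE=1 /usr/bin/python3.12 .../AnsiballZ_setup.py, so the interpreter reports every module it loads and every cached .pyc it matches (plus the version banner just above) before any Ansible code runs. To distill this noise into something readable, a small, hypothetical filter that lists the ansible.* modules the payload ends up importing can be written against exactly the line format visible here:

    import re
    import sys

    # matches trace entries such as:  import 'ansible.module_utils.six' # <...>
    IMPORT_RE = re.compile(r"import '(?P<name>ansible[^']*)' #")

    def ansible_imports(log_path: str) -> list[str]:
        """Return the ansible.* module names reported by a PYTHONVERBOSE import trace."""
        names: list[str] = []
        with open(log_path, encoding="utf-8", errors="replace") as fh:
            for line in fh:
                for match in IMPORT_RE.finditer(line):
                    names.append(match.group("name"))
        return names

    if __name__ == "__main__":
        for name in ansible_imports(sys.argv[1]):
            print(name)

Run against a saved copy of this log it would print entries such as ansible.module_utils.six and ansible.module_utils.common.text.converters, which appear further down in the trace.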
<<< 12372 1727204075.28463: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 12372 1727204075.28467: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 12372 1727204075.28513: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 12372 1727204075.28538: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 12372 1727204075.28612: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 12372 1727204075.28637: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 12372 1727204075.28660: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e0bbec0> <<< 12372 1727204075.28697: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 12372 1727204075.28736: stdout chunk (state=3): >>>import '_operator' # <<< 12372 1727204075.28757: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e0bbf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 12372 1727204075.28802: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 12372 1727204075.28894: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 12372 1727204075.28927: stdout chunk (state=3): >>>import 'itertools' # <<< 12372 1727204075.28962: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e0f38c0> <<< 12372 1727204075.28997: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 12372 1727204075.29018: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e0f3f20> import '_collections' # <<< 12372 1727204075.29088: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e0d3b90> <<< 12372 1727204075.29143: stdout chunk (state=3): >>>import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e0d12b0> <<< 12372 1727204075.29299: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e0b9070> <<< 12372 1727204075.29324: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_compiler.py <<< 12372 1727204075.29349: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 12372 1727204075.29395: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 12372 1727204075.29427: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 12372 1727204075.29450: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 12372 1727204075.29531: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e117740> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e116360> <<< 12372 1727204075.29545: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 12372 1727204075.29635: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e0d22a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e0baf60> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 12372 1727204075.29665: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e148770> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e0b82f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 12372 1727204075.29682: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 12372 1727204075.29719: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943e148c20> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e148ad0> <<< 12372 1727204075.29758: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204075.29864: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943e148e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e0b6e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 12372 1727204075.29872: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 12372 1727204075.29934: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e149520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e149220> import 'importlib.machinery' # <<< 12372 1727204075.29939: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 12372 1727204075.29969: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e14a420> <<< 12372 1727204075.30005: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 12372 1727204075.30008: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 12372 1727204075.30095: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e164650> <<< 12372 1727204075.30144: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204075.30150: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943e165d90> <<< 12372 1727204075.30186: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 12372 1727204075.30226: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e166c90> <<< 12372 1727204075.30276: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943e1672f0> <<< 12372 1727204075.30308: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e1661e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 12372 1727204075.30328: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 12372 1727204075.30363: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943e167d70> <<< 12372 1727204075.30445: stdout chunk 
(state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e1674a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e14a480> <<< 12372 1727204075.30462: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 12372 1727204075.30512: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 12372 1727204075.30528: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 12372 1727204075.30548: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 12372 1727204075.30621: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204075.30643: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943de5bcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204075.30685: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943de847a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943de84500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943de847d0> <<< 12372 1727204075.30713: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943de849b0> <<< 12372 1727204075.30760: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943de59e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 12372 1727204075.30912: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 12372 1727204075.30954: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 12372 1727204075.31003: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943de86060> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943de84ce0> <<< 12372 1727204075.31018: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f943e14a600> <<< 12372 1727204075.31040: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 12372 1727204075.31141: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 12372 1727204075.31200: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 12372 1727204075.31322: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943deb2420> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 12372 1727204075.31337: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 12372 1727204075.31369: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 12372 1727204075.31372: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 12372 1727204075.31446: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943deca5a0> <<< 12372 1727204075.31471: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 12372 1727204075.31544: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 12372 1727204075.31624: stdout chunk (state=3): >>>import 'ntpath' # <<< 12372 1727204075.31680: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943df03350> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 12372 1727204075.31728: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 12372 1727204075.31814: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 12372 1727204075.31955: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943df29af0> <<< 12372 1727204075.32072: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943df03470> <<< 12372 1727204075.32129: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943decb230> <<< 12372 1727204075.32172: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dd444a0> <<< 12372 1727204075.32347: stdout chunk (state=3): >>>import 'zipfile._path' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f943dec95e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943de86fc0> <<< 12372 1727204075.32475: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f943dd44770> <<< 12372 1727204075.32766: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_75d3kfuy/ansible_setup_payload.zip' <<< 12372 1727204075.32971: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.33036: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.33074: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 12372 1727204075.33086: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 12372 1727204075.33142: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 12372 1727204075.33253: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 12372 1727204075.33299: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943ddb2240> <<< 12372 1727204075.33310: stdout chunk (state=3): >>>import '_typing' # <<< 12372 1727204075.33616: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dd89130> <<< 12372 1727204075.33642: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dd88290> # zipimport: zlib available <<< 12372 1727204075.33677: stdout chunk (state=3): >>>import 'ansible' # <<< 12372 1727204075.33681: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.33729: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.33733: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.33759: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 12372 1727204075.36340: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.38492: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 12372 1727204075.38534: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 12372 1727204075.38543: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dd8b230> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 12372 1727204075.38565: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 12372 1727204075.38613: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # 
/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 12372 1727204075.38616: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 12372 1727204075.38643: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dde1bb0> <<< 12372 1727204075.38738: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dde1940> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dde1250> <<< 12372 1727204075.38755: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 12372 1727204075.38774: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 12372 1727204075.38819: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dde1ca0> <<< 12372 1727204075.38851: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943ddb2c60> import 'atexit' # <<< 12372 1727204075.38916: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dde2960> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dde2ba0> <<< 12372 1727204075.38937: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 12372 1727204075.39028: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 12372 1727204075.39081: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dde30b0> <<< 12372 1727204075.39128: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 12372 1727204075.39157: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 12372 1727204075.39258: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dc48e90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dc4aab0> <<< 12372 1727204075.39283: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py 
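The imports in this region hint at what the setup module is about to do even in a "minimum subset" facts run: json for serializing the result it will print to stdout, and grp, pwd, locale, platform and selectors for probing the system. As a purely illustrative stand-in (not Ansible's setup module, just the same standard-library pieces the trace shows being loaded), a minimal facts probe is shaped like this:

    import json
    import os
    import platform

    # hypothetical mini fact gatherer: collect a few values and emit them the way a
    # fact-gathering module reports them, as a single JSON document on stdout
    facts = {
        "system": platform.system(),
        "python_version": platform.python_version(),
        "effective_user_id": os.geteuid(),
    }
    print(json.dumps({"ansible_facts": facts}))

The real module gathers far more, filters it down to the requested gather_subset, and wraps the result in the same kind of ansible_facts JSON payload.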
<<< 12372 1727204075.39355: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dc4b350> <<< 12372 1727204075.39372: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 12372 1727204075.39425: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 12372 1727204075.39455: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dc4c530><<< 12372 1727204075.39501: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 12372 1727204075.39563: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 12372 1727204075.39602: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py<<< 12372 1727204075.39625: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc'<<< 12372 1727204075.39723: stdout chunk (state=3): >>> import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dc4eff0> <<< 12372 1727204075.39797: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204075.39817: stdout chunk (state=3): >>>import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dc4f110> <<< 12372 1727204075.39853: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dc4d2b0> <<< 12372 1727204075.39884: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py<<< 12372 1727204075.39893: stdout chunk (state=3): >>> <<< 12372 1727204075.39938: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc'<<< 12372 1727204075.39943: stdout chunk (state=3): >>> <<< 12372 1727204075.40050: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 12372 1727204075.40065: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 12372 1727204075.40095: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 12372 1727204075.40114: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dc52f60> <<< 12372 1727204075.40142: stdout chunk (state=3): >>>import '_tokenize' # <<< 12372 1727204075.40259: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dc51a30> <<< 12372 1727204075.40272: stdout chunk (state=3): 
>>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dc51790><<< 12372 1727204075.40309: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 12372 1727204075.40338: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc'<<< 12372 1727204075.40341: stdout chunk (state=3): >>> <<< 12372 1727204075.40513: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dc53e90> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dc4d7c0><<< 12372 1727204075.40520: stdout chunk (state=3): >>> <<< 12372 1727204075.40561: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204075.40566: stdout chunk (state=3): >>> <<< 12372 1727204075.40582: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204075.40628: stdout chunk (state=3): >>>import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dc97050> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py<<< 12372 1727204075.40642: stdout chunk (state=3): >>> <<< 12372 1727204075.40651: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' <<< 12372 1727204075.40691: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dc97200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py<<< 12372 1727204075.40697: stdout chunk (state=3): >>> <<< 12372 1727204075.40731: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc'<<< 12372 1727204075.40736: stdout chunk (state=3): >>> <<< 12372 1727204075.40768: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py<<< 12372 1727204075.40774: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc'<<< 12372 1727204075.40827: stdout chunk (state=3): >>> # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204075.40849: stdout chunk (state=3): >>> <<< 12372 1727204075.40859: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dca0dd0><<< 12372 1727204075.40872: stdout chunk (state=3): >>> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dca0b90><<< 12372 1727204075.40910: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 12372 1727204075.41094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc'<<< 12372 1727204075.41099: stdout chunk (state=3): >>> <<< 12372 1727204075.41163: stdout chunk (state=3): >>># extension module 
'_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204075.41188: stdout chunk (state=3): >>> import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dca3320> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dca14c0><<< 12372 1727204075.41225: stdout chunk (state=3): >>> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py<<< 12372 1727204075.41234: stdout chunk (state=3): >>> <<< 12372 1727204075.41306: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc'<<< 12372 1727204075.41312: stdout chunk (state=3): >>> <<< 12372 1727204075.41341: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py<<< 12372 1727204075.41366: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 12372 1727204075.41438: stdout chunk (state=3): >>>import '_string' # <<< 12372 1727204075.41482: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dca6a80><<< 12372 1727204075.41485: stdout chunk (state=3): >>> <<< 12372 1727204075.41753: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dca3410> <<< 12372 1727204075.41864: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204075.41878: stdout chunk (state=3): >>> <<< 12372 1727204075.41891: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dca77a0><<< 12372 1727204075.41934: stdout chunk (state=3): >>> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204075.41962: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204075.41965: stdout chunk (state=3): >>>import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dca4290><<< 12372 1727204075.42048: stdout chunk (state=3): >>> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204075.42070: stdout chunk (state=3): >>> <<< 12372 1727204075.42073: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204075.42087: stdout chunk (state=3): >>> <<< 12372 1727204075.42095: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dca7b30> <<< 12372 1727204075.42124: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dc97500><<< 12372 1727204075.42130: stdout chunk (state=3): >>> <<< 12372 1727204075.42167: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py<<< 12372 1727204075.42171: stdout chunk (state=3): >>> <<< 12372 1727204075.42187: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc'<<< 12372 1727204075.42223: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 12372 1727204075.42266: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc'<<< 12372 1727204075.42316: stdout chunk (state=3): >>> # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204075.42325: stdout chunk (state=3): >>> <<< 12372 1727204075.42376: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204075.42380: stdout chunk (state=3): >>> <<< 12372 1727204075.42397: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dcab440> <<< 12372 1727204075.42741: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204075.42749: stdout chunk (state=3): >>> <<< 12372 1727204075.42775: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dcac770> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dca9bb0><<< 12372 1727204075.42781: stdout chunk (state=3): >>> <<< 12372 1727204075.42820: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204075.42837: stdout chunk (state=3): >>> import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dcaaf60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dca97c0><<< 12372 1727204075.42868: stdout chunk (state=3): >>> # zipimport: zlib available<<< 12372 1727204075.42875: stdout chunk (state=3): >>> <<< 12372 1727204075.42898: stdout chunk (state=3): >>># zipimport: zlib available<<< 12372 1727204075.42927: stdout chunk (state=3): >>> import 'ansible.module_utils.compat' # <<< 12372 1727204075.42958: stdout chunk (state=3): >>> # zipimport: zlib available <<< 12372 1727204075.43137: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.43327: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12372 1727204075.43364: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # # zipimport: zlib available <<< 12372 1727204075.43404: stdout chunk (state=3): >>># zipimport: zlib available<<< 12372 1727204075.43446: stdout chunk (state=3): >>> import 'ansible.module_utils.common.text' # <<< 12372 1727204075.43450: stdout chunk (state=3): >>> # zipimport: zlib available<<< 12372 1727204075.43468: stdout chunk (state=3): >>> <<< 12372 1727204075.43715: stdout chunk (state=3): >>># 
zipimport: zlib available<<< 12372 1727204075.43747: stdout chunk (state=3): >>> <<< 12372 1727204075.43921: stdout chunk (state=3): >>># zipimport: zlib available<<< 12372 1727204075.43924: stdout chunk (state=3): >>> <<< 12372 1727204075.45095: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.46236: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 12372 1727204075.46256: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 12372 1727204075.46271: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 12372 1727204075.46345: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 12372 1727204075.46352: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 12372 1727204075.46434: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204075.46441: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943db348f0> <<< 12372 1727204075.46619: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 12372 1727204075.46627: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 12372 1727204075.46670: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943db35760> <<< 12372 1727204075.46735: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dcaf170> <<< 12372 1727204075.46765: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 12372 1727204075.46785: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.46820: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.46852: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 12372 1727204075.46879: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.47344: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.47649: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943db35790> # zipimport: zlib available <<< 12372 1727204075.48448: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.49403: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.49524: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.49667: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 12372 1727204075.49687: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.49756: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.49817: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 12372 1727204075.49842: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 12372 1727204075.49974: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.50165: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 12372 1727204075.50190: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.50221: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.50229: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 12372 1727204075.50260: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.50323: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.50386: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 12372 1727204075.50410: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.50892: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.51368: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 12372 1727204075.51500: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 12372 1727204075.51516: stdout chunk (state=3): >>>import '_ast' # <<< 12372 1727204075.51664: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943db367e0> <<< 12372 1727204075.51743: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.51818: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.51946: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 12372 1727204075.51960: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # <<< 12372 1727204075.51980: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 12372 1727204075.52027: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 12372 1727204075.52031: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 12372 1727204075.52148: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 12372 1727204075.52167: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204075.52360: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204075.52370: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943db3e120> <<< 12372 1727204075.52484: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204075.52501: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943db3eab0> <<< 12372 1727204075.52510: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dca7350> <<< 12372 1727204075.52520: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.52577: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 
1727204075.52648: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 12372 1727204075.52660: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.52896: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12372 1727204075.52919: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.53043: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 12372 1727204075.53074: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 12372 1727204075.53208: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204075.53250: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943db3d8e0> <<< 12372 1727204075.53304: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943db3ecc0> <<< 12372 1727204075.53363: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 12372 1727204075.53394: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 12372 1727204075.53491: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.53604: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.53657: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.53711: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 12372 1727204075.53842: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 12372 1727204075.53940: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 12372 1727204075.53944: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 12372 1727204075.53948: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 12372 1727204075.54046: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dbd2e10> <<< 12372 1727204075.54121: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943db48b90> <<< 12372 1727204075.54272: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943db46d20> <<< 12372 1727204075.54275: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943db46b70> <<< 12372 
1727204075.54312: stdout chunk (state=3): >>># destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 12372 1727204075.54346: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.54398: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 12372 1727204075.54527: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 12372 1727204075.54532: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.54545: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.54743: stdout chunk (state=3): >>>import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available <<< 12372 1727204075.54763: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.54801: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.54824: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.54897: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.54955: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.55015: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.55088: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 12372 1727204075.55094: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.55229: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.55366: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.55415: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.55469: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 12372 1727204075.55479: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.55788: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.56112: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.56173: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.56266: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 12372 1727204075.56288: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 12372 1727204075.56341: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 12372 1727204075.56358: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 12372 1727204075.56412: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 12372 1727204075.56525: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dbd5b20> <<< 12372 1727204075.56535: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 12372 1727204075.56569: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 12372 1727204075.56578: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 12372 1727204075.56761: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943d1082f0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943d1088c0> <<< 12372 1727204075.56791: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dbb5340> <<< 12372 1727204075.56829: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dbb42c0> <<< 12372 1727204075.56919: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dbd4200> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dbd5be0> <<< 12372 1727204075.56965: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 12372 1727204075.57015: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 12372 1727204075.57060: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 12372 1727204075.57153: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 12372 1727204075.57183: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943d10b6b0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943d10af60> <<< 12372 1727204075.57254: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204075.57290: stdout chunk (state=3): >>>import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943d10b140> <<< 12372 1727204075.57480: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943d10a390> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 12372 1727204075.57560: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943d10b860> <<< 12372 1727204075.57649: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 12372 1727204075.57697: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943d172360> <<< 12372 1727204075.57742: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943d170380> <<< 12372 1727204075.57830: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dbd7d40> <<< 12372 1727204075.57834: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 12372 1727204075.57842: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 12372 1727204075.57884: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 12372 1727204075.57895: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.57987: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.58352: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available <<< 12372 1727204075.58402: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 12372 1727204075.58406: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.58462: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.58536: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 12372 1727204075.58564: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.58632: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.58707: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 12372 1727204075.58731: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.58817: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.58932: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.59020: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.59107: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 12372 1727204075.59114: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 12372 1727204075.59140: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.60017: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.60696: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 12372 1727204075.60704: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.60822: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.60908: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 12372 1727204075.60933: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 12372 1727204075.60986: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.61051: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 12372 1727204075.61062: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.61122: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.61175: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 12372 1727204075.61200: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.61230: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 12372 1727204075.61344: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.61440: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 12372 1727204075.61448: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943d172570> <<< 12372 1727204075.61665: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 12372 1727204075.61838: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943d1730e0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available <<< 12372 1727204075.61927: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 12372 1727204075.61941: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.62092: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.62243: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 12372 1727204075.62273: stdout chunk (state=3): >>> # zipimport: zlib available<<< 12372 1727204075.62280: stdout chunk (state=3): >>> <<< 12372 1727204075.62386: stdout chunk (state=3): >>># zipimport: zlib available<<< 12372 1727204075.62440: stdout chunk (state=3): >>> <<< 12372 1727204075.62524: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 12372 1727204075.62531: stdout chunk (state=3): >>> <<< 12372 1727204075.62559: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.62625: stdout chunk (state=3): >>># zipimport: zlib available<<< 12372 1727204075.62712: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 12372 1727204075.62797: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc'<<< 12372 1727204075.62803: stdout chunk (state=3): >>> <<< 12372 1727204075.62910: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204075.62917: stdout chunk (state=3): >>> <<< 12372 1727204075.63040: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204075.63044: stdout chunk (state=3): >>> <<< 12372 1727204075.63047: stdout chunk (state=3): >>>import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943d1a6630><<< 12372 1727204075.63065: stdout chunk (state=3): >>> <<< 12372 1727204075.63422: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943d18f2f0><<< 12372 1727204075.63453: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.python' # <<< 12372 1727204075.63460: stdout chunk (state=3): >>> # zipimport: zlib available <<< 12372 1727204075.63562: stdout chunk (state=3): >>># zipimport: zlib available<<< 12372 1727204075.63698: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available<<< 12372 1727204075.63722: stdout chunk (state=3): >>> <<< 12372 1727204075.63839: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.63992: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.64219: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.64697: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 12372 1727204075.64766: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943cfbdfa0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943cfbdb80> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 12372 1727204075.64824: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 12372 1727204075.65018: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 12372 1727204075.65086: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.65219: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 12372 1727204075.65241: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.65659: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.65663: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.65666: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.65669: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 12372 1727204075.65674: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 12372 1727204075.65760: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.66049: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 12372 1727204075.66057: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.66400: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 12372 1727204075.66404: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.66407: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.66409: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.66898: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.67488: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 12372 1727204075.67494: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 12372 1727204075.67602: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.67718: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 12372 1727204075.67736: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.67833: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.67951: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 12372 1727204075.67964: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.68124: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.68312: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 12372 1727204075.68471: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available <<< 12372 1727204075.68475: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 12372 1727204075.68495: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.68543: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.68658: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.69053: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.69324: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 12372 1727204075.69354: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.network.aix' # <<< 12372 1727204075.69383: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.69468: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.69542: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 12372 1727204075.69558: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.69603: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.69656: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available<<< 12372 1727204075.69764: stdout chunk (state=3): >>> # zipimport: 
zlib available <<< 12372 1727204075.69888: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 12372 1727204075.69924: stdout chunk (state=3): >>> <<< 12372 1727204075.69927: stdout chunk (state=3): >>># zipimport: zlib available<<< 12372 1727204075.69929: stdout chunk (state=3): >>> <<< 12372 1727204075.69972: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.70034: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 12372 1727204075.70050: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.70152: stdout chunk (state=3): >>># zipimport: zlib available<<< 12372 1727204075.70173: stdout chunk (state=3): >>> <<< 12372 1727204075.70266: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 12372 1727204075.70310: stdout chunk (state=3): >>> <<< 12372 1727204075.70312: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.70401: stdout chunk (state=3): >>># zipimport: zlib available<<< 12372 1727204075.70435: stdout chunk (state=3): >>> <<< 12372 1727204075.70520: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 12372 1727204075.70550: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.71054: stdout chunk (state=3): >>># zipimport: zlib available<<< 12372 1727204075.71136: stdout chunk (state=3): >>> <<< 12372 1727204075.71575: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 12372 1727204075.71596: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.71649: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.71722: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 12372 1727204075.71726: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.71785: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.71836: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available <<< 12372 1727204075.71928: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 12372 1727204075.71995: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 12372 1727204075.72080: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.72199: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 12372 1727204075.72234: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 12372 1727204075.72286: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.72365: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available <<< 12372 1727204075.72421: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12372 1727204075.72465: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.72804: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 12372 1727204075.72895: stdout chunk 
(state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 12372 1727204075.73279: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.73788: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 12372 1727204075.73806: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.73809: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.73953: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 12372 1727204075.73980: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.74042: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 12372 1727204075.74064: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.74205: stdout chunk (state=3): >>># zipimport: zlib available<<< 12372 1727204075.74404: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 12372 1727204075.74454: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.74609: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.74808: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 12372 1727204075.74940: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204075.75876: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943cfe78f0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943cfe5460> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943cfe5d90> <<< 12372 1727204075.76634: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "35", "epoch": "1727204075", "epoch_int": "1727204075", "date": "2024-09-24", "time": "14:54:35", "iso8601_micro": "2024-09-24T18:54:35.751331Z", "iso8601": "2024-09-24T18:54:35Z", "iso8601_basic": "20240924T145435751331", "iso8601_basic_short": "20240924T145435", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": 
"6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_fips": false, "ansible_service_mgr": "systemd", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12372 1727204075.77339: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout <<< 12372 1727204075.77444: stdout chunk (state=3): >>># restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # 
cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize <<< 12372 1727204075.77472: stdout chunk (state=3): >>># cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six 
# destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast <<< 12372 1727204075.77618: stdout chunk (state=3): >>># cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing 
ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing 
ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy 
ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 12372 1727204075.77951: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 12372 1727204075.78060: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 12372 1727204075.78133: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 12372 1727204075.78287: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 12372 1727204075.78343: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 12372 1727204075.78466: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json <<< 12372 1727204075.78476: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 12372 1727204075.78657: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] 
wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix <<< 12372 1727204075.78693: stdout chunk (state=3): >>># destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 12372 1727204075.78720: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 12372 1727204075.78846: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 12372 1727204075.78927: stdout chunk (state=3): >>># destroy _collections <<< 12372 1727204075.78936: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 12372 1727204075.79036: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 12372 1727204075.79039: stdout chunk (state=3): >>># clear sys.meta_path 
# clear sys.modules # destroy _frozen_importlib <<< 12372 1727204075.79114: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 12372 1727204075.79221: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re <<< 12372 1727204075.79251: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 12372 1727204075.79777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 12372 1727204075.79781: stdout chunk (state=3): >>><<< 12372 1727204075.79783: stderr chunk (state=3): >>><<< 12372 1727204075.79961: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e2cc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e29bad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e2cea20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e07d0a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f943e07dfd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e0bbec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e0bbf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e0f38c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e0f3f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e0d3b90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e0d12b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e0b9070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e117740> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e116360> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from 
'/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e0d22a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e0baf60> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e148770> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e0b82f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943e148c20> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e148ad0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943e148e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e0b6e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e149520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e149220> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e14a420> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e164650> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943e165d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e166c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943e1672f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e1661e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943e167d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e1674a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e14a480> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943de5bcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943de847a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943de84500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943de847d0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943de849b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943de59e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943de86060> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943de84ce0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943e14a600> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943deb2420> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943deca5a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943df03350> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943df29af0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943df03470> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943decb230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dd444a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dec95e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943de86fc0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f943dd44770> # zipimport: found 103 names in '/tmp/ansible_setup_payload_75d3kfuy/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from 
'/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943ddb2240> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dd89130> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dd88290> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dd8b230> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dde1bb0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dde1940> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dde1250> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dde1ca0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943ddb2c60> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dde2960> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dde2ba0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # 
import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dde30b0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dc48e90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dc4aab0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dc4b350> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dc4c530> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dc4eff0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dc4f110> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dc4d2b0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dc52f60> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dc51a30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dc51790> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dc53e90> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dc4d7c0> # 
extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dc97050> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dc97200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dca0dd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dca0b90> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dca3320> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dca14c0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dca6a80> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dca3410> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dca77a0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dca4290> # extension module 'systemd.id128' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dca7b30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dc97500> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dcab440> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dcac770> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dca9bb0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943dcaaf60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dca97c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943db348f0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943db35760> import 'ctypes' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f943dcaf170> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943db35790> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943db367e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943db3e120> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943db3eab0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dca7350> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943db3d8e0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943db3ecc0> import 
'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dbd2e10> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943db48b90> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943db46d20> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943db46b70> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dbd5b20> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943d1082f0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943d1088c0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dbb5340> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dbb42c0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dbd4200> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943dbd5be0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943d10b6b0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943d10af60> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943d10b140> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943d10a390> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943d10b860> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943d172360> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943d170380> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f943dbd7d40> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943d172570> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943d1730e0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943d1a6630> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943d18f2f0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943cfbdfa0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943cfbdb80> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f943cfe78f0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943cfe5460> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f943cfe5d90> {"ansible_facts": {"ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "35", "epoch": "1727204075", "epoch_int": "1727204075", "date": "2024-09-24", "time": "14:54:35", "iso8601_micro": "2024-09-24T18:54:35.751331Z", "iso8601": "2024-09-24T18:54:35Z", "iso8601_basic": "20240924T145435751331", "iso8601_basic_short": "20240924T145435", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "Fedora", 
"ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_fips": false, "ansible_service_mgr": "systemd", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing 
site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # 
cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing 
multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing 
ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy 
ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian 
# cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # 
destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # 
cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] 
removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing 
ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing 
ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy 
ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # 
cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 12372 1727204075.81963: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', 
'_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204075.0402052-12794-272177309588317/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12372 1727204075.81967: _low_level_execute_command(): starting 12372 1727204075.81970: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204075.0402052-12794-272177309588317/ > /dev/null 2>&1 && sleep 0' 12372 1727204075.82194: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12372 1727204075.82198: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12372 1727204075.82273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12372 1727204075.82340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12372 1727204075.82366: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12372 1727204075.82402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12372 1727204075.82469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12372 1727204075.84419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12372 1727204075.84497: stderr chunk (state=3): >>><<< 12372 1727204075.84522: stdout chunk (state=3): >>><<< 12372 1727204075.84706: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12372 1727204075.84710: handler run complete 12372 1727204075.84713: variable 'ansible_facts' from source: unknown 12372 1727204075.84757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204075.85203: variable 'ansible_facts' from source: unknown 12372 1727204075.85341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204075.85534: attempt loop complete, returning result 12372 1727204075.85544: _execute() done 12372 1727204075.85552: dumping result to json 12372 1727204075.85571: done dumping result, returning 12372 1727204075.85584: done running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [12b410aa-8751-244a-02f9-0000000001cd] 12372 1727204075.85609: sending task result for task 12b410aa-8751-244a-02f9-0000000001cd 12372 1727204075.85979: done sending task result for task 12b410aa-8751-244a-02f9-0000000001cd 12372 1727204075.85982: WORKER PROCESS EXITING ok: [managed-node3] 12372 1727204075.86160: no more pending results, returning what we have 12372 1727204075.86163: results queue empty 12372 1727204075.86164: checking for any_errors_fatal 12372 1727204075.86166: done checking for any_errors_fatal 12372 1727204075.86167: checking for max_fail_percentage 12372 1727204075.86169: done checking for max_fail_percentage 12372 1727204075.86170: checking to see if all hosts have failed and the running result is not ok 12372 1727204075.86171: done checking to see if all hosts have failed 12372 1727204075.86172: getting the remaining hosts for this loop 12372 1727204075.86173: done getting the remaining hosts for this loop 12372 1727204075.86179: getting the next task for host managed-node3 12372 1727204075.86529: done getting next task for host managed-node3 12372 1727204075.86534: ^ task is: TASK: Check if system is ostree 12372 1727204075.86538: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204075.86549: getting variables 12372 1727204075.86551: in VariableManager get_vars() 12372 1727204075.86605: Calling all_inventory to load vars for managed-node3 12372 1727204075.86612: Calling groups_inventory to load vars for managed-node3 12372 1727204075.86687: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204075.86701: Calling all_plugins_play to load vars for managed-node3 12372 1727204075.86705: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204075.86708: Calling groups_plugins_play to load vars for managed-node3 12372 1727204075.87096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204075.87388: done with get_vars() 12372 1727204075.87404: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:54:35 -0400 (0:00:01.004) 0:00:02.860 ***** 12372 1727204075.87526: entering _queue_task() for managed-node3/stat 12372 1727204075.87883: worker is 1 (out of 1 available) 12372 1727204075.87898: exiting _queue_task() for managed-node3/stat 12372 1727204075.87911: done queuing things up, now waiting for results queue to drain 12372 1727204075.87914: waiting for pending results... 12372 1727204075.88526: running TaskExecutor() for managed-node3/TASK: Check if system is ostree 12372 1727204075.88532: in run() - task 12b410aa-8751-244a-02f9-0000000001cf 12372 1727204075.88535: variable 'ansible_search_path' from source: unknown 12372 1727204075.88542: variable 'ansible_search_path' from source: unknown 12372 1727204075.88546: calling self._execute() 12372 1727204075.88549: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204075.88552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204075.88555: variable 'omit' from source: magic vars 12372 1727204075.89063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12372 1727204075.89420: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12372 1727204075.89495: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12372 1727204075.89614: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12372 1727204075.89618: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12372 1727204075.89676: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12372 1727204075.89844: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12372 1727204075.89880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204075.89921: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
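The task announced above ("Check if system is ostree", el_repo_setup.yml:17) runs the stat module on the target and, as the next record shows, is gated on the conditional `not __network_is_ostree is defined`. A minimal local sketch of what such a check amounts to, assuming the commonly used marker path /run/ostree-booted (the exact path the task stats is not visible in this excerpt):

# Sketch only -- not the AnsiballZ_stat module from this log.
# Assumption: the task stats an OSTree marker such as /run/ostree-booted;
# the exact path is not shown in this excerpt.
import os

OSTREE_MARKER = "/run/ostree-booted"  # assumed marker path

def is_ostree_booted(marker: str = OSTREE_MARKER) -> bool:
    """Return True when the marker exists, i.e. the host booted via OSTree/rpm-ostree."""
    return os.path.exists(marker)

if __name__ == "__main__":
    print("ostree booted:", is_ostree_booted())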
12372 1727204075.90313: Evaluated conditional (not __network_is_ostree is defined): True 12372 1727204075.90323: variable 'omit' from source: magic vars 12372 1727204075.90696: variable 'omit' from source: magic vars 12372 1727204075.90700: variable 'omit' from source: magic vars 12372 1727204075.90703: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12372 1727204075.90740: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12372 1727204075.90768: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12372 1727204075.90833: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12372 1727204075.90922: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12372 1727204075.90962: variable 'inventory_hostname' from source: host vars for 'managed-node3' 12372 1727204075.91032: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204075.91042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204075.91209: Set connection var ansible_connection to ssh 12372 1727204075.91458: Set connection var ansible_timeout to 10 12372 1727204075.91461: Set connection var ansible_module_compression to ZIP_DEFLATED 12372 1727204075.91464: Set connection var ansible_shell_executable to /bin/sh 12372 1727204075.91466: Set connection var ansible_shell_type to sh 12372 1727204075.91469: Set connection var ansible_pipelining to False 12372 1727204075.91471: variable 'ansible_shell_executable' from source: unknown 12372 1727204075.91473: variable 'ansible_connection' from source: unknown 12372 1727204075.91475: variable 'ansible_module_compression' from source: unknown 12372 1727204075.91477: variable 'ansible_shell_type' from source: unknown 12372 1727204075.91479: variable 'ansible_shell_executable' from source: unknown 12372 1727204075.91481: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204075.91483: variable 'ansible_pipelining' from source: unknown 12372 1727204075.91485: variable 'ansible_timeout' from source: unknown 12372 1727204075.91568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204075.91933: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12372 1727204075.91955: variable 'omit' from source: magic vars 12372 1727204075.91993: starting attempt loop 12372 1727204075.92006: running the handler 12372 1727204075.92030: _low_level_execute_command(): starting 12372 1727204075.92045: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12372 1727204075.92888: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12372 1727204075.92963: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12372 1727204075.93001: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12372 1727204075.93074: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12372 1727204075.95265: stdout chunk (state=3): >>>/root <<< 12372 1727204075.95269: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12372 1727204075.95272: stdout chunk (state=3): >>><<< 12372 1727204075.95274: stderr chunk (state=3): >>><<< 12372 1727204075.95277: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12372 1727204075.95286: _low_level_execute_command(): starting 12372 1727204075.95291: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204075.9517457-12837-253670546599986 `" && echo ansible-tmp-1727204075.9517457-12837-253670546599986="` echo /root/.ansible/tmp/ansible-tmp-1727204075.9517457-12837-253670546599986 `" ) && sleep 0' 12372 1727204075.96485: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12372 1727204075.96493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12372 1727204075.96497: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12372 1727204075.96499: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12372 1727204075.96748: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12372 1727204075.96774: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12372 1727204075.96820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12372 1727204075.98829: stdout chunk (state=3): >>>ansible-tmp-1727204075.9517457-12837-253670546599986=/root/.ansible/tmp/ansible-tmp-1727204075.9517457-12837-253670546599986 <<< 12372 1727204075.99020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12372 1727204075.99197: stderr chunk (state=3): >>><<< 12372 1727204075.99202: stdout chunk (state=3): >>><<< 12372 1727204075.99205: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204075.9517457-12837-253670546599986=/root/.ansible/tmp/ansible-tmp-1727204075.9517457-12837-253670546599986 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12372 1727204075.99324: variable 'ansible_module_compression' from source: unknown 12372 1727204075.99379: ANSIBALLZ: Using lock for stat 12372 1727204075.99424: ANSIBALLZ: Acquiring lock 12372 1727204075.99654: ANSIBALLZ: Lock acquired: 140438065238896 12372 1727204075.99657: ANSIBALLZ: Creating module 12372 1727204076.19071: ANSIBALLZ: Writing module into payload 12372 1727204076.19206: ANSIBALLZ: Writing module 12372 1727204076.19238: ANSIBALLZ: Renaming module 12372 1727204076.19255: ANSIBALLZ: Done creating module 12372 1727204076.19278: variable 'ansible_facts' from source: unknown 12372 1727204076.19371: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204075.9517457-12837-253670546599986/AnsiballZ_stat.py 12372 1727204076.19626: Sending initial data 12372 1727204076.19630: Sent initial data (153 bytes) 12372 1727204076.20222: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12372 1727204076.20238: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12372 1727204076.20305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12372 1727204076.20363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12372 1727204076.20447: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12372 1727204076.20515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 12372 1727204076.22888: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12372 1727204076.22955: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
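The ANSIBALLZ records above ("Creating module", "Writing module into payload") together with the SFTP negotiation show how the stat module reaches the target: the controller packs the module source and its module_utils dependencies into a single zip payload, uploads it into the ansible-tmp directory created earlier, and then executes it with the remote Python. A simplified, self-contained sketch of the packaging step, with placeholder file names (not Ansible's real wrapper code):

# Simplified illustration of the payload idea; file names below are placeholders.
import os
import tempfile
import zipfile

def build_payload(payload_path: str, module_source: str) -> None:
    """Zip a module file into a one-file payload, using ZIP_DEFLATED as in the log above."""
    with zipfile.ZipFile(payload_path, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(module_source, arcname="ansible/modules/stat.py")

if __name__ == "__main__":
    workdir = tempfile.mkdtemp(prefix="ansible-local-")   # mirrors the ansible-local-* staging dir
    src = os.path.join(workdir, "stat.py")
    with open(src, "w") as f:
        f.write("print('placeholder module body')\n")     # stand-in for the real module source
    build_payload(os.path.join(workdir, "payload.zip"), src)
    print("payload written under", workdir)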
<<< 12372 1727204076.23018: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12372u51ts529/tmpijlie351 /root/.ansible/tmp/ansible-tmp-1727204075.9517457-12837-253670546599986/AnsiballZ_stat.py <<< 12372 1727204076.23056: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204075.9517457-12837-253670546599986/AnsiballZ_stat.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12372u51ts529/tmpijlie351" to remote "/root/.ansible/tmp/ansible-tmp-1727204075.9517457-12837-253670546599986/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204075.9517457-12837-253670546599986/AnsiballZ_stat.py" <<< 12372 1727204076.24449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12372 1727204076.24509: stderr chunk (state=3): >>><<< 12372 1727204076.24536: stdout chunk (state=3): >>><<< 12372 1727204076.24578: done transferring module to remote 12372 1727204076.24605: _low_level_execute_command(): starting 12372 1727204076.24620: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204075.9517457-12837-253670546599986/ /root/.ansible/tmp/ansible-tmp-1727204075.9517457-12837-253670546599986/AnsiballZ_stat.py && sleep 0' 12372 1727204076.25369: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12372 1727204076.25373: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12372 1727204076.25377: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12372 1727204076.25402: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12372 1727204076.25405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12372 1727204076.25468: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12372 1727204076.25483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12372 1727204076.25527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 12372 1727204076.28238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12372 1727204076.28288: stderr chunk (state=3): >>><<< 12372 1727204076.28291: stdout chunk (state=3): >>><<< 12372 1727204076.28309: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 12372 1727204076.28313: _low_level_execute_command(): starting 12372 1727204076.28321: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204075.9517457-12837-253670546599986/AnsiballZ_stat.py && sleep 0' 12372 1727204076.28975: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12372 1727204076.28978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12372 1727204076.29002: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12372 1727204076.29005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12372 1727204076.29061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12372 1727204076.29068: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12372 1727204076.29117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 12372 1727204076.36320: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f951220c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95121dbad0> # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f951220ea20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120210a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9512021fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache_<<< 12372 1727204076.36413: stdout chunk (state=3): >>>_/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f951205fec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f951205ff80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120978c0> # 
/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9512097f50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9512077b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120752b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f951205d070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120bb890> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120ba4b0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120762a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120b8bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120ec800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f951205c2f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' execu<<< 12372 1727204076.36418: stdout chunk (state=3): >>>ted from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f95120eccb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120ecb60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f95120ecf50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f951205ae10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from 
'/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120ed610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120ed2e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120ee510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9512108740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9512109e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f951210ad80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f951210b3e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f951210a2d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f951210be30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f951210b560> import 'shutil' # <_froze<<< 12372 1727204076.36582: stdout chunk (state=3): >>>n_importlib_external.SourceFileLoader object at 0x7f95120ee570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511ee3d40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511f0c860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511f0c5c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511f0c890> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511f0ca70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511ee1ee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511f0e180> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511f0ce00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120eec60><<< 12372 1727204076.36860: stdout chunk (state=3): >>> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py<<< 12372 1727204076.36865: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511f36510> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 12372 1727204076.36868: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py<<< 12372 1727204076.37112: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511f52690> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 12372 
1727204076.37218: stdout chunk (state=3): >>>import 'ntpath' # <<< 12372 1727204076.37276: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc'<<< 12372 1727204076.37298: stdout chunk (state=3): >>> import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511f87410><<< 12372 1727204076.37327: stdout chunk (state=3): >>> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 12372 1727204076.37400: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 12372 1727204076.37502: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc'<<< 12372 1727204076.37523: stdout chunk (state=3): >>> <<< 12372 1727204076.37714: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511fb1bb0> <<< 12372 1727204076.37802: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511f87530><<< 12372 1727204076.37825: stdout chunk (state=3): >>> <<< 12372 1727204076.38011: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511f53320> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511d8c4a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511f516d0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511f0f0b0><<< 12372 1727204076.38136: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 12372 1727204076.38184: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f9511f517f0> <<< 12372 1727204076.38307: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_c_p8vvtn/ansible_stat_payload.zip' <<< 12372 1727204076.38333: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.38609: stdout chunk (state=3): >>># zipimport: zlib available<<< 12372 1727204076.38630: stdout chunk (state=3): >>> <<< 12372 1727204076.38654: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 12372 1727204076.38678: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 12372 1727204076.38753: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 12372 1727204076.38876: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 12372 1727204076.38927: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 12372 1727204076.38967: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511de60c0> <<< 12372 1727204076.38985: stdout chunk (state=3): >>>import '_typing' # <<< 12372 1727204076.39141: stdout chunk (state=3): >>> <<< 12372 1727204076.39413: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511dbd040> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511dbc1a0> # zipimport: zlib available import 'ansible' # <<< 12372 1727204076.39445: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.39484: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.39510: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.39532: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 12372 1727204076.39564: stdout chunk (state=3): >>># zipimport: zlib available<<< 12372 1727204076.39582: stdout chunk (state=3): >>> <<< 12372 1727204076.42119: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.44302: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 12372 1727204076.44339: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc'<<< 12372 1727204076.44369: stdout chunk (state=3): >>> import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511dbffe0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 12372 1727204076.44407: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 12372 1727204076.44443: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc'<<< 12372 1727204076.44497: stdout chunk (state=3): >>> # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 12372 1727204076.44558: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204076.44572: stdout chunk (state=3): >>> # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511e11b80><<< 12372 1727204076.44625: stdout chunk (state=3): >>> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511e11910><<< 12372 1727204076.44664: stdout chunk (state=3): >>> <<< 12372 1727204076.44685: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511e11220> <<< 12372 1727204076.44722: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 12372 1727204076.44741: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 12372 1727204076.44804: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7f9511e11c70> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511de6b70><<< 12372 1727204076.44825: stdout chunk (state=3): >>> import 'atexit' # <<< 12372 1727204076.44884: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204076.44926: stdout chunk (state=3): >>> import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511e12930> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204076.44956: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511e12b70><<< 12372 1727204076.44995: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 12372 1727204076.45076: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 12372 1727204076.45159: stdout chunk (state=3): >>>import '_locale' # <<< 12372 1727204076.45184: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511e130b0> <<< 12372 1727204076.45232: stdout chunk (state=3): >>>import 'pwd' # <<< 12372 1727204076.45249: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 12372 1727204076.45360: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511c74e90> <<< 12372 1727204076.45428: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204076.45449: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511c76ab0> <<< 12372 1727204076.45481: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py<<< 12372 1727204076.45503: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 12372 1727204076.45575: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511c773b0> <<< 12372 1727204076.45651: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 12372 1727204076.45697: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511c78590> <<< 12372 1727204076.45727: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py<<< 12372 1727204076.45795: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 12372 1727204076.45840: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 12372 1727204076.45855: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 12372 1727204076.45963: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511c7b080> <<< 12372 1727204076.46015: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204076.46052: stdout chunk (state=3): >>> # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511c7b1d0> <<< 12372 1727204076.46074: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511c79340> <<< 12372 1727204076.46151: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 12372 1727204076.46230: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 12372 1727204076.46285: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 12372 1727204076.46337: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 12372 1727204076.46362: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 12372 1727204076.46400: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511c7eff0> import '_tokenize' # <<< 12372 1727204076.46507: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511c7dac0><<< 12372 1727204076.46555: stdout chunk (state=3): >>> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511c7d820> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 12372 1727204076.46585: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 12372 1727204076.46714: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511c7fec0><<< 12372 1727204076.46753: stdout chunk (state=3): >>> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511c79850> <<< 12372 1727204076.46811: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204076.46815: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204076.46866: stdout chunk (state=3): >>>import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511cc7110> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 12372 1727204076.46879: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' <<< 12372 1727204076.46886: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511cc7290><<< 12372 1727204076.46925: stdout chunk (state=3): >>> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py<<< 12372 1727204076.46931: stdout chunk (state=3): >>> <<< 12372 1727204076.46973: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc'<<< 12372 1727204076.46979: stdout chunk (state=3): >>> <<< 12372 1727204076.47006: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py<<< 12372 1727204076.47063: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204076.47094: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204076.47098: stdout chunk (state=3): >>>import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511cc8e60> <<< 12372 1727204076.47118: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511cc8c20><<< 12372 1727204076.47146: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 12372 1727204076.47331: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc'<<< 12372 1727204076.47337: stdout chunk (state=3): >>> <<< 12372 1727204076.47402: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204076.47426: stdout chunk (state=3): >>> import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511ccb3b0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511cc9550><<< 12372 1727204076.47483: stdout chunk (state=3): >>> <<< 12372 1727204076.47490: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 12372 1727204076.47575: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc'<<< 12372 1727204076.47612: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py<<< 12372 1727204076.47619: stdout chunk (state=3): >>> <<< 12372 1727204076.47647: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc'<<< 12372 1727204076.47654: stdout chunk (state=3): >>> <<< 12372 1727204076.47669: stdout chunk (state=3): >>>import '_string' # <<< 12372 1727204076.47759: stdout chunk (state=3): >>> 
import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511cd2ba0><<< 12372 1727204076.47765: stdout chunk (state=3): >>> <<< 12372 1727204076.48030: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511ccb530><<< 12372 1727204076.48036: stdout chunk (state=3): >>> <<< 12372 1727204076.48152: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204076.48158: stdout chunk (state=3): >>> # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204076.48175: stdout chunk (state=3): >>> <<< 12372 1727204076.48179: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511cd3e60> <<< 12372 1727204076.48232: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204076.48248: stdout chunk (state=3): >>> <<< 12372 1727204076.48258: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511cd3a10><<< 12372 1727204076.48360: stdout chunk (state=3): >>> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204076.48364: stdout chunk (state=3): >>> <<< 12372 1727204076.48366: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204076.48369: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511cd3f20><<< 12372 1727204076.48382: stdout chunk (state=3): >>> <<< 12372 1727204076.48407: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511cc7590> <<< 12372 1727204076.48450: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 12372 1727204076.48466: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc'<<< 12372 1727204076.48504: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py<<< 12372 1727204076.48512: stdout chunk (state=3): >>> <<< 12372 1727204076.48558: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 12372 1727204076.48666: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204076.48671: stdout chunk (state=3): >>> <<< 12372 1727204076.48685: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511cd7680><<< 12372 1727204076.48694: stdout chunk (state=3): >>> <<< 12372 1727204076.49016: stdout 
chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204076.49056: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511cd8650><<< 12372 1727204076.49080: stdout chunk (state=3): >>> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511cd5df0> <<< 12372 1727204076.49126: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204076.49158: stdout chunk (state=3): >>> # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511cd7170> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511cd59d0><<< 12372 1727204076.49187: stdout chunk (state=3): >>> # zipimport: zlib available <<< 12372 1727204076.49206: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.49241: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # # zipimport: zlib available<<< 12372 1727204076.49252: stdout chunk (state=3): >>> <<< 12372 1727204076.49458: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.49610: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12372 1727204076.49649: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # # zipimport: zlib available <<< 12372 1727204076.49679: stdout chunk (state=3): >>># zipimport: zlib available<<< 12372 1727204076.49681: stdout chunk (state=3): >>> <<< 12372 1727204076.49723: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # # zipimport: zlib available<<< 12372 1727204076.49726: stdout chunk (state=3): >>> <<< 12372 1727204076.49967: stdout chunk (state=3): >>># zipimport: zlib available<<< 12372 1727204076.50038: stdout chunk (state=3): >>> <<< 12372 1727204076.50217: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.51409: stdout chunk (state=3): >>># zipimport: zlib available<<< 12372 1727204076.51417: stdout chunk (state=3): >>> <<< 12372 1727204076.52624: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 12372 1727204076.52659: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 12372 1727204076.52685: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 12372 1727204076.52754: stdout chunk (state=3): >>> import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 12372 1727204076.52807: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 12372 1727204076.52879: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204076.52906: stdout chunk (state=3): >>> # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f9511d60830><<< 12372 1727204076.53079: stdout chunk (state=3): >>> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py<<< 12372 1727204076.53111: stdout chunk (state=3): >>> <<< 12372 1727204076.53114: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc'<<< 12372 1727204076.53118: stdout chunk (state=3): >>> <<< 12372 1727204076.53149: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511d61580><<< 12372 1727204076.53155: stdout chunk (state=3): >>> <<< 12372 1727204076.53178: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511c7da30><<< 12372 1727204076.53184: stdout chunk (state=3): >>> <<< 12372 1727204076.53261: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 12372 1727204076.53296: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.53343: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.53373: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 12372 1727204076.53405: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.53711: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.54013: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 12372 1727204076.54044: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 12372 1727204076.54068: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511d61340><<< 12372 1727204076.54103: stdout chunk (state=3): >>> # zipimport: zlib available <<< 12372 1727204076.55112: stdout chunk (state=3): >>># zipimport: zlib available<<< 12372 1727204076.55244: stdout chunk (state=3): >>> <<< 12372 1727204076.56078: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.56231: stdout chunk (state=3): >>># zipimport: zlib available<<< 12372 1727204076.56239: stdout chunk (state=3): >>> <<< 12372 1727204076.56398: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 12372 1727204076.56424: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.56505: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.56624: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 12372 1727204076.56752: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.56946: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 12372 1727204076.56977: stdout chunk (state=3): >>># zipimport: zlib available<<< 12372 1727204076.57012: stdout chunk (state=3): >>> # zipimport: zlib available <<< 12372 1727204076.57041: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 12372 1727204076.57066: stdout chunk (state=3): >>> # zipimport: zlib available <<< 12372 1727204076.57159: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.57245: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 12372 1727204076.57840: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.58219: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 12372 1727204076.58358: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 12372 1727204076.58396: stdout chunk (state=3): >>>import '_ast' # <<< 12372 1727204076.58542: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511d63f50> <<< 12372 1727204076.58571: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.58732: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.58917: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 12372 1727204076.58931: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 12372 1727204076.58968: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py<<< 12372 1727204076.58995: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc'<<< 12372 1727204076.59094: stdout chunk (state=3): >>> # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204076.59103: stdout chunk (state=3): >>> <<< 12372 1727204076.59318: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511b6df10> <<< 12372 1727204076.59414: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204076.59466: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511b6e840> <<< 12372 1727204076.59490: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511d63290> <<< 12372 1727204076.59515: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.59596: stdout chunk (state=3): >>># zipimport: zlib available<<< 12372 1727204076.59673: stdout chunk (state=3): >>> import 'ansible.module_utils.common.locale' # <<< 12372 1727204076.59706: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.59803: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.59889: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.60127: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 12372 1727204076.60219: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 12372 1727204076.60400: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 12372 1727204076.60425: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from 
'/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'<<< 12372 1727204076.60435: stdout chunk (state=3): >>> import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511b6d6d0><<< 12372 1727204076.60445: stdout chunk (state=3): >>> <<< 12372 1727204076.60526: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511b6ea80> <<< 12372 1727204076.60596: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 12372 1727204076.60611: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 12372 1727204076.60639: stdout chunk (state=3): >>># zipimport: zlib available<<< 12372 1727204076.60764: stdout chunk (state=3): >>> # zipimport: zlib available<<< 12372 1727204076.60774: stdout chunk (state=3): >>> <<< 12372 1727204076.60893: stdout chunk (state=3): >>># zipimport: zlib available<<< 12372 1727204076.60897: stdout chunk (state=3): >>> <<< 12372 1727204076.60946: stdout chunk (state=3): >>># zipimport: zlib available<<< 12372 1727204076.60954: stdout chunk (state=3): >>> <<< 12372 1727204076.61059: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 12372 1727204076.61075: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 12372 1727204076.61104: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 12372 1727204076.61149: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 12372 1727204076.61179: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py<<< 12372 1727204076.61202: stdout chunk (state=3): >>> <<< 12372 1727204076.61344: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 12372 1727204076.61349: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 12372 1727204076.61460: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511bfed50><<< 12372 1727204076.61546: stdout chunk (state=3): >>> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511b78b30> <<< 12372 1727204076.61910: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511b76b70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511b769c0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 12372 1727204076.61969: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 12372 1727204076.62011: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12372 1727204076.62057: stdout chunk (state=3): >>>import 'ansible.modules' # # zipimport: zlib available<<< 12372 1727204076.62060: stdout chunk (state=3): >>> <<< 
12372 1727204076.62298: stdout chunk (state=3): >>># zipimport: zlib available<<< 12372 1727204076.62440: stdout chunk (state=3): >>> <<< 12372 1727204076.62664: stdout chunk (state=3): >>># zipimport: zlib available <<< 12372 1727204076.62877: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 12372 1727204076.62936: stdout chunk (state=3): >>># destroy __main__<<< 12372 1727204076.62949: stdout chunk (state=3): >>> <<< 12372 1727204076.63440: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 12372 1727204076.63496: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ <<< 12372 1727204076.63559: stdout chunk (state=3): >>># clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc<<< 12372 1727204076.63563: stdout chunk (state=3): >>> # clear sys.last_type<<< 12372 1727204076.63618: stdout chunk (state=3): >>> # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref<<< 12372 1727204076.63626: stdout chunk (state=3): >>> # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport<<< 12372 1727204076.63666: stdout chunk (state=3): >>> # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path <<< 12372 1727204076.63670: stdout chunk (state=3): >>># cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator <<< 12372 1727204076.63706: stdout chunk (state=3): >>># cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser<<< 12372 1727204076.63747: stdout chunk (state=3): >>> # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing 
importlib.machinery # cleanup[2] removing importlib._abc<<< 12372 1727204076.63859: stdout chunk (state=3): >>> # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib<<< 12372 1727204076.63865: stdout chunk (state=3): >>> # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils<<< 12372 1727204076.64044: stdout chunk (state=3): >>> # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon<<< 12372 1727204076.64123: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] 
removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections<<< 12372 1727204076.64192: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux<<< 12372 1727204076.64222: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro<<< 12372 1727204076.64350: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 12372 1727204076.64667: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery <<< 12372 1727204076.64699: stdout chunk (state=3): >>># destroy importlib._abc # destroy importlib.util <<< 12372 1727204076.64894: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy 
encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 12372 1727204076.65055: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon <<< 12372 1727204076.65148: stdout chunk (state=3): >>># cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit<<< 12372 1727204076.65199: stdout chunk (state=3): >>> # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser<<< 12372 1727204076.65295: stdout chunk (state=3): >>> # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath<<< 12372 1727204076.65403: stdout chunk (state=3): >>> # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal<<< 12372 1727204076.65457: stdout chunk (state=3): >>> # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux<<< 12372 1727204076.65570: stdout chunk (state=3): >>> # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 12372 1727204076.65725: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 12372 1727204076.65848: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath 
# destroy re._parser <<< 12372 1727204076.65911: stdout chunk (state=3): >>># destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg <<< 12372 1727204076.65988: stdout chunk (state=3): >>># destroy contextlib # destroy _typing # destroy _tokenize <<< 12372 1727204076.65994: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse <<< 12372 1727204076.66077: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external<<< 12372 1727204076.66107: stdout chunk (state=3): >>> # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib<<< 12372 1727204076.66315: stdout chunk (state=3): >>> # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings<<< 12372 1727204076.66332: stdout chunk (state=3): >>> # destroy math # destroy _bisect # destroy time <<< 12372 1727204076.66422: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _operator # destroy _sha2 <<< 12372 1727204076.66477: stdout chunk (state=3): >>># destroy _string # destroy re # destroy itertools <<< 12372 1727204076.66506: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread<<< 12372 1727204076.66579: stdout chunk (state=3): >>> # clear sys.audit hooks <<< 12372 1727204076.67199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
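The single result line embedded in the chunk stream above, {"changed": false, "stat": {"exists": false}}, is the payload of this whole burst of output: the stat module was run against /run/ostree-booted (a marker file typically present only on ostree-booted systems), the path does not exist on the target, and the SSH connection is then closed. Everything around that line is the one-shot remote Python interpreter importing the zipped module payload (ansible_stat_payload.zip) via zipimport, printing the one JSON result line, and tearing its module state back down. The sketch below is a minimal, purely illustrative reconstruction of that payload mechanism, assuming a stand-in module name demo_stat and archive name payload.zip; it is not Ansible's actual AnsiballZ wrapper, which bundles far more module_utils code and handles temp files, arguments, and cleanup differently.

```python
# Illustrative sketch only: ship a module inside a zip, let zipimport load it
# from sys.path, and emit a single JSON result line -- the same general shape
# seen in the log above for /run/ostree-booted.
import os
import sys
import tempfile
import zipfile

payload_dir = tempfile.mkdtemp(prefix="payload_demo_")
payload_zip = os.path.join(payload_dir, "payload.zip")

# Stand-in for ansible_stat_payload.zip: one tiny module that checks a path
# and prints a result dict mirroring {"changed": false, "stat": {"exists": ...}}.
module_source = (
    "import json, os\n"
    "def main(path):\n"
    "    print(json.dumps({'changed': False, 'stat': {'exists': os.path.exists(path)}}))\n"
)
with zipfile.ZipFile(payload_zip, "w") as zf:
    zf.writestr("demo_stat.py", module_source)

# Putting a zip archive on sys.path makes zipimport responsible for imports
# from it (compare the "# zipimport: found 30 names in ..." line in the log).
sys.path.insert(0, payload_zip)
import demo_stat  # loaded from inside payload.zip by zipimport

demo_stat.main("/run/ostree-booted")
```

On a host where that path is absent, the sketch prints {"changed": false, "stat": {"exists": false}}, matching the module result recovered from the chunks above.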
<<< 12372 1727204076.67203: stdout chunk (state=3): >>><<< 12372 1727204076.67212: stderr chunk (state=3): >>><<< 12372 1727204076.67346: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f951220c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95121dbad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f951220ea20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120210a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9512021fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f951205fec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f951205ff80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120978c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9512097f50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9512077b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120752b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f951205d070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120bb890> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120ba4b0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120762a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120b8bc0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120ec800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f951205c2f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f95120eccb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120ecb60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f95120ecf50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f951205ae10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120ed610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120ed2e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120ee510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9512108740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9512109e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f951210ad80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f951210b3e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f951210a2d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f951210be30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f951210b560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120ee570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511ee3d40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511f0c860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511f0c5c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511f0c890> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511f0ca70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511ee1ee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511f0e180> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511f0ce00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95120eec60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511f36510> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511f52690> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511f87410> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511fb1bb0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511f87530> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511f53320> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511d8c4a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511f516d0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511f0f0b0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f9511f517f0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_c_p8vvtn/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511de60c0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511dbd040> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511dbc1a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511dbffe0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511e11b80> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511e11910> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511e11220> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511e11c70> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511de6b70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511e12930> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511e12b70> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511e130b0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511c74e90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511c76ab0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511c773b0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511c78590> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511c7b080> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511c7b1d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511c79340> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511c7eff0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511c7dac0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511c7d820> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511c7fec0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511c79850> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511cc7110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511cc7290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511cc8e60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511cc8c20> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511ccb3b0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511cc9550> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511cd2ba0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511ccb530> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511cd3e60> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511cd3a10> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511cd3f20> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511cc7590> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511cd7680> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511cd8650> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511cd5df0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511cd7170> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511cd59d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511d60830> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511d61580> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511c7da30> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511d61340> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511d63f50> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511b6df10> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511b6e840> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511d63290> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9511b6d6d0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511b6ea80> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511bfed50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511b78b30> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511b76b70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9511b769c0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy 
reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # 
cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy 
ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # 
destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
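The stderr block above is OpenSSH debug output for the module run: the "auto-mux: Trying existing master" and "mux_client_request_session: master session id: 4" lines show each command being funnelled through an already-established multiplexed connection to 10.31.10.90 rather than a fresh SSH handshake. In inventory terms this corresponds to per-host connection variables such as the ansible_host and ansible_ssh_extra_args values that the variable trace further below reports for managed-node3. A minimal sketch of such a host entry in a YAML inventory follows; the option values are purely illustrative, since the log never prints the actual extra args:

all:
  hosts:
    managed-node3:
      ansible_host: 10.31.10.90   # address seen in the ssh debug output above
      # Illustrative values only: ControlMaster/ControlPersist are the OpenSSH
      # options that produce the "auto-mux" connection reuse visible in the log.
      ansible_ssh_extra_args: "-o ControlMaster=auto -o ControlPersist=60s"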
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # 
cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy 
re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 12372 1727204076.68552: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204075.9517457-12837-253670546599986/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12372 1727204076.68555: _low_level_execute_command(): starting 12372 1727204076.68558: 
_low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204075.9517457-12837-253670546599986/ > /dev/null 2>&1 && sleep 0' 12372 1727204076.68968: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12372 1727204076.68972: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12372 1727204076.68991: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12372 1727204076.69079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 12372 1727204076.71879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12372 1727204076.71999: stderr chunk (state=3): >>><<< 12372 1727204076.72015: stdout chunk (state=3): >>><<< 12372 1727204076.72160: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 12372 1727204076.72167: handler run complete 12372 1727204076.72205: attempt loop complete, returning result 12372 1727204076.72208: _execute() done 12372 1727204076.72218: dumping result to json 12372 1727204076.72221: done dumping result, returning 12372 1727204076.72232: done running TaskExecutor() for managed-node3/TASK: Check if system is ostree [12b410aa-8751-244a-02f9-0000000001cf] 12372 1727204076.72240: sending task result for task 12b410aa-8751-244a-02f9-0000000001cf 12372 1727204076.72594: done sending task result for task 12b410aa-8751-244a-02f9-0000000001cf 12372 1727204076.72597: WORKER PROCESS EXITING ok: [managed-node3] => { 
"changed": false, "stat": { "exists": false } } 12372 1727204076.72681: no more pending results, returning what we have 12372 1727204076.72685: results queue empty 12372 1727204076.72686: checking for any_errors_fatal 12372 1727204076.72699: done checking for any_errors_fatal 12372 1727204076.72700: checking for max_fail_percentage 12372 1727204076.72702: done checking for max_fail_percentage 12372 1727204076.72703: checking to see if all hosts have failed and the running result is not ok 12372 1727204076.72704: done checking to see if all hosts have failed 12372 1727204076.72705: getting the remaining hosts for this loop 12372 1727204076.72706: done getting the remaining hosts for this loop 12372 1727204076.72711: getting the next task for host managed-node3 12372 1727204076.72720: done getting next task for host managed-node3 12372 1727204076.72724: ^ task is: TASK: Set flag to indicate system is ostree 12372 1727204076.72726: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204076.72730: getting variables 12372 1727204076.72732: in VariableManager get_vars() 12372 1727204076.72765: Calling all_inventory to load vars for managed-node3 12372 1727204076.72768: Calling groups_inventory to load vars for managed-node3 12372 1727204076.72773: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204076.72785: Calling all_plugins_play to load vars for managed-node3 12372 1727204076.72991: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204076.73000: Calling groups_plugins_play to load vars for managed-node3 12372 1727204076.73264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204076.73919: done with get_vars() 12372 1727204076.73932: done getting variables 12372 1727204076.74140: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:54:36 -0400 (0:00:00.866) 0:00:03.727 ***** 12372 1727204076.74177: entering _queue_task() for managed-node3/set_fact 12372 1727204076.74180: Creating lock for set_fact 12372 1727204076.74609: worker is 1 (out of 1 available) 12372 1727204076.74625: exiting _queue_task() for managed-node3/set_fact 12372 1727204076.74639: done queuing things up, now waiting for results queue to drain 12372 1727204076.74641: waiting for pending results... 
12372 1727204076.74809: running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree 12372 1727204076.74927: in run() - task 12b410aa-8751-244a-02f9-0000000001d0 12372 1727204076.74942: variable 'ansible_search_path' from source: unknown 12372 1727204076.74946: variable 'ansible_search_path' from source: unknown 12372 1727204076.74994: calling self._execute() 12372 1727204076.75078: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204076.75097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204076.75115: variable 'omit' from source: magic vars 12372 1727204076.75603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12372 1727204076.76086: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12372 1727204076.76321: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12372 1727204076.76325: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12372 1727204076.76380: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12372 1727204076.76519: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12372 1727204076.76560: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12372 1727204076.76597: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204076.76644: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12372 1727204076.76860: Evaluated conditional (not __network_is_ostree is defined): True 12372 1727204076.76871: variable 'omit' from source: magic vars 12372 1727204076.76927: variable 'omit' from source: magic vars 12372 1727204076.77090: variable '__ostree_booted_stat' from source: set_fact 12372 1727204076.77149: variable 'omit' from source: magic vars 12372 1727204076.77192: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12372 1727204076.77229: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12372 1727204076.77253: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12372 1727204076.77318: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12372 1727204076.77396: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12372 1727204076.77399: variable 'inventory_hostname' from source: host vars for 'managed-node3' 12372 1727204076.77402: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204076.77404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204076.77510: Set connection var ansible_connection to ssh 12372 
1727204076.77538: Set connection var ansible_timeout to 10 12372 1727204076.77551: Set connection var ansible_module_compression to ZIP_DEFLATED 12372 1727204076.77563: Set connection var ansible_shell_executable to /bin/sh 12372 1727204076.77571: Set connection var ansible_shell_type to sh 12372 1727204076.77594: Set connection var ansible_pipelining to False 12372 1727204076.77694: variable 'ansible_shell_executable' from source: unknown 12372 1727204076.77699: variable 'ansible_connection' from source: unknown 12372 1727204076.77702: variable 'ansible_module_compression' from source: unknown 12372 1727204076.77705: variable 'ansible_shell_type' from source: unknown 12372 1727204076.77708: variable 'ansible_shell_executable' from source: unknown 12372 1727204076.77711: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204076.77713: variable 'ansible_pipelining' from source: unknown 12372 1727204076.77716: variable 'ansible_timeout' from source: unknown 12372 1727204076.77719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204076.78110: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12372 1727204076.78114: variable 'omit' from source: magic vars 12372 1727204076.78116: starting attempt loop 12372 1727204076.78118: running the handler 12372 1727204076.78121: handler run complete 12372 1727204076.78123: attempt loop complete, returning result 12372 1727204076.78125: _execute() done 12372 1727204076.78127: dumping result to json 12372 1727204076.78131: done dumping result, returning 12372 1727204076.78134: done running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree [12b410aa-8751-244a-02f9-0000000001d0] 12372 1727204076.78165: sending task result for task 12b410aa-8751-244a-02f9-0000000001d0 12372 1727204076.78471: done sending task result for task 12b410aa-8751-244a-02f9-0000000001d0 12372 1727204076.78476: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 12372 1727204076.78549: no more pending results, returning what we have 12372 1727204076.78552: results queue empty 12372 1727204076.78554: checking for any_errors_fatal 12372 1727204076.78562: done checking for any_errors_fatal 12372 1727204076.78563: checking for max_fail_percentage 12372 1727204076.78564: done checking for max_fail_percentage 12372 1727204076.78565: checking to see if all hosts have failed and the running result is not ok 12372 1727204076.78566: done checking to see if all hosts have failed 12372 1727204076.78567: getting the remaining hosts for this loop 12372 1727204076.78568: done getting the remaining hosts for this loop 12372 1727204076.78573: getting the next task for host managed-node3 12372 1727204076.78583: done getting next task for host managed-node3 12372 1727204076.78586: ^ task is: TASK: Fix CentOS6 Base repo 12372 1727204076.78591: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204076.78595: getting variables 12372 1727204076.78597: in VariableManager get_vars() 12372 1727204076.78627: Calling all_inventory to load vars for managed-node3 12372 1727204076.78630: Calling groups_inventory to load vars for managed-node3 12372 1727204076.78634: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204076.78646: Calling all_plugins_play to load vars for managed-node3 12372 1727204076.78649: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204076.78658: Calling groups_plugins_play to load vars for managed-node3 12372 1727204076.78979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204076.79684: done with get_vars() 12372 1727204076.79704: done getting variables 12372 1727204076.80070: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:54:36 -0400 (0:00:00.059) 0:00:03.786 ***** 12372 1727204076.80107: entering _queue_task() for managed-node3/copy 12372 1727204076.80833: worker is 1 (out of 1 available) 12372 1727204076.80847: exiting _queue_task() for managed-node3/copy 12372 1727204076.80860: done queuing things up, now waiting for results queue to drain 12372 1727204076.80862: waiting for pending results... 
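The "Set flag to indicate system is ostree" task traced above evaluated its conditional (not __network_is_ostree is defined) to True, read the registered __ostree_booted_stat result, and returned ansible_facts with __network_is_ostree: false, matching the exists: false from the preceding stat call. A plausible reconstruction of that task is sketched below; the exact Jinja expression is inferred from the fact value tracking stat.exists and is not shown verbatim in this log:

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    # Inferred expression: the logged fact (false) matches __ostree_booted_stat.stat.exists (false);
    # the real task file may word this differently.
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined

The next task in the trace, "Fix CentOS6 Base repo", is then skipped because its condition ansible_distribution == 'CentOS' evaluates to False on this host.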
12372 1727204076.81633: running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo 12372 1727204076.81687: in run() - task 12b410aa-8751-244a-02f9-0000000001d2 12372 1727204076.81712: variable 'ansible_search_path' from source: unknown 12372 1727204076.81731: variable 'ansible_search_path' from source: unknown 12372 1727204076.81866: calling self._execute() 12372 1727204076.82063: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204076.82122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204076.82202: variable 'omit' from source: magic vars 12372 1727204076.84042: variable 'ansible_distribution' from source: facts 12372 1727204076.84073: Evaluated conditional (ansible_distribution == 'CentOS'): False 12372 1727204076.84084: when evaluation is False, skipping this task 12372 1727204076.84497: _execute() done 12372 1727204076.84503: dumping result to json 12372 1727204076.84506: done dumping result, returning 12372 1727204076.84509: done running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo [12b410aa-8751-244a-02f9-0000000001d2] 12372 1727204076.84512: sending task result for task 12b410aa-8751-244a-02f9-0000000001d2 12372 1727204076.84600: done sending task result for task 12b410aa-8751-244a-02f9-0000000001d2 12372 1727204076.84604: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 12372 1727204076.84721: no more pending results, returning what we have 12372 1727204076.84725: results queue empty 12372 1727204076.84726: checking for any_errors_fatal 12372 1727204076.84731: done checking for any_errors_fatal 12372 1727204076.84732: checking for max_fail_percentage 12372 1727204076.84734: done checking for max_fail_percentage 12372 1727204076.84735: checking to see if all hosts have failed and the running result is not ok 12372 1727204076.84736: done checking to see if all hosts have failed 12372 1727204076.84737: getting the remaining hosts for this loop 12372 1727204076.84738: done getting the remaining hosts for this loop 12372 1727204076.84742: getting the next task for host managed-node3 12372 1727204076.84750: done getting next task for host managed-node3 12372 1727204076.84753: ^ task is: TASK: Include the task 'enable_epel.yml' 12372 1727204076.84756: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204076.84759: getting variables 12372 1727204076.84761: in VariableManager get_vars() 12372 1727204076.84787: Calling all_inventory to load vars for managed-node3 12372 1727204076.84792: Calling groups_inventory to load vars for managed-node3 12372 1727204076.84796: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204076.84806: Calling all_plugins_play to load vars for managed-node3 12372 1727204076.84809: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204076.84813: Calling groups_plugins_play to load vars for managed-node3 12372 1727204076.85560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204076.86395: done with get_vars() 12372 1727204076.86408: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 14:54:36 -0400 (0:00:00.065) 0:00:03.852 ***** 12372 1727204076.86666: entering _queue_task() for managed-node3/include_tasks 12372 1727204076.87326: worker is 1 (out of 1 available) 12372 1727204076.87405: exiting _queue_task() for managed-node3/include_tasks 12372 1727204076.87420: done queuing things up, now waiting for results queue to drain 12372 1727204076.87423: waiting for pending results... 12372 1727204076.88026: running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' 12372 1727204076.88033: in run() - task 12b410aa-8751-244a-02f9-0000000001d3 12372 1727204076.88273: variable 'ansible_search_path' from source: unknown 12372 1727204076.88278: variable 'ansible_search_path' from source: unknown 12372 1727204076.88329: calling self._execute() 12372 1727204076.88424: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204076.88443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204076.88461: variable 'omit' from source: magic vars 12372 1727204076.89024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204076.92021: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204076.92026: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204076.92077: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204076.92140: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204076.92182: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204076.92294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204076.92344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204076.92395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 12372 1727204076.92492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204076.92496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204076.92647: variable '__network_is_ostree' from source: set_fact 12372 1727204076.92682: Evaluated conditional (not __network_is_ostree | d(false)): True 12372 1727204076.92700: _execute() done 12372 1727204076.92780: dumping result to json 12372 1727204076.92784: done dumping result, returning 12372 1727204076.92791: done running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' [12b410aa-8751-244a-02f9-0000000001d3] 12372 1727204076.92794: sending task result for task 12b410aa-8751-244a-02f9-0000000001d3 12372 1727204076.92919: no more pending results, returning what we have 12372 1727204076.92926: in VariableManager get_vars() 12372 1727204076.92976: Calling all_inventory to load vars for managed-node3 12372 1727204076.92980: Calling groups_inventory to load vars for managed-node3 12372 1727204076.92986: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204076.93005: Calling all_plugins_play to load vars for managed-node3 12372 1727204076.93009: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204076.93014: Calling groups_plugins_play to load vars for managed-node3 12372 1727204076.93573: done sending task result for task 12b410aa-8751-244a-02f9-0000000001d3 12372 1727204076.93577: WORKER PROCESS EXITING 12372 1727204076.93605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204076.93891: done with get_vars() 12372 1727204076.93901: variable 'ansible_search_path' from source: unknown 12372 1727204076.93903: variable 'ansible_search_path' from source: unknown 12372 1727204076.93953: we have included files to process 12372 1727204076.93955: generating all_blocks data 12372 1727204076.93957: done generating all_blocks data 12372 1727204076.93968: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 12372 1727204076.93969: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 12372 1727204076.93973: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 12372 1727204076.95035: done processing included file 12372 1727204076.95037: iterating over new_blocks loaded from include file 12372 1727204076.95039: in VariableManager get_vars() 12372 1727204076.95051: done with get_vars() 12372 1727204076.95052: filtering new block on tags 12372 1727204076.95087: done filtering new block on tags 12372 1727204076.95093: in VariableManager get_vars() 12372 1727204076.95103: done with get_vars() 12372 1727204076.95104: filtering new block on tags 12372 1727204076.95114: done filtering new block on tags 12372 1727204076.95119: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node3 12372 1727204076.95125: extending task lists for all hosts with included blocks 
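The include that pulls in enable_epel.yml is almost fully visible in the trace: its name, the file it loads, and the guard that evaluated True. A sketch of the corresponding task (the relative path is an assumption; the log only shows the absolute path of the loaded file):

- name: Include the task 'enable_epel.yml'
  include_tasks: enable_epel.yml
  when: not __network_is_ostree | d(false)

Because include_tasks is dynamic, the included file is only parsed after the include runs, which is why the "generating all_blocks data ... extending task lists" messages appear at this point in the run rather than at playbook parse time.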
12372 1727204076.95231: done extending task lists 12372 1727204076.95233: done processing included files 12372 1727204076.95234: results queue empty 12372 1727204076.95235: checking for any_errors_fatal 12372 1727204076.95238: done checking for any_errors_fatal 12372 1727204076.95239: checking for max_fail_percentage 12372 1727204076.95240: done checking for max_fail_percentage 12372 1727204076.95240: checking to see if all hosts have failed and the running result is not ok 12372 1727204076.95241: done checking to see if all hosts have failed 12372 1727204076.95241: getting the remaining hosts for this loop 12372 1727204076.95242: done getting the remaining hosts for this loop 12372 1727204076.95244: getting the next task for host managed-node3 12372 1727204076.95247: done getting next task for host managed-node3 12372 1727204076.95249: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 12372 1727204076.95251: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204076.95252: getting variables 12372 1727204076.95253: in VariableManager get_vars() 12372 1727204076.95259: Calling all_inventory to load vars for managed-node3 12372 1727204076.95261: Calling groups_inventory to load vars for managed-node3 12372 1727204076.95263: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204076.95267: Calling all_plugins_play to load vars for managed-node3 12372 1727204076.95272: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204076.95276: Calling groups_plugins_play to load vars for managed-node3 12372 1727204076.95415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204076.95574: done with get_vars() 12372 1727204076.95582: done getting variables 12372 1727204076.95667: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 12372 1727204076.95802: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 39] ********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 14:54:36 -0400 (0:00:00.091) 0:00:03.944 ***** 12372 1727204076.95862: entering _queue_task() for managed-node3/command 12372 1727204076.95865: Creating lock for command 12372 1727204076.96127: worker is 1 (out of 1 available) 12372 1727204076.96143: exiting _queue_task() for managed-node3/command 12372 1727204076.96156: done queuing things up, now waiting for results queue to drain 12372 1727204076.96158: waiting for pending results... 
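The banner "Create EPEL 39" comes from a templated task name: the raw name "Create EPEL {{ ansible_distribution_major_version }}" is rendered with the gathered facts (major version 39 on this host). A hedged sketch of what such a command task could look like; only the name, the command action, and the guard are taken from the log, and the epel-release URL and install command are illustrative assumptions:

- name: Create EPEL {{ ansible_distribution_major_version }}
  command: >-
    dnf install -y
    https://dl.fedoraproject.org/pub/epel/epel-release-latest-{{ ansible_distribution_major_version }}.noarch.rpm
  when: ansible_distribution in ['RedHat', 'CentOS']   # evaluates to False below, so the task is skipped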
12372 1727204076.96341: running TaskExecutor() for managed-node3/TASK: Create EPEL 39 12372 1727204076.96437: in run() - task 12b410aa-8751-244a-02f9-0000000001ed 12372 1727204076.96449: variable 'ansible_search_path' from source: unknown 12372 1727204076.96452: variable 'ansible_search_path' from source: unknown 12372 1727204076.96483: calling self._execute() 12372 1727204076.96586: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204076.96595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204076.96599: variable 'omit' from source: magic vars 12372 1727204076.97194: variable 'ansible_distribution' from source: facts 12372 1727204076.97198: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 12372 1727204076.97201: when evaluation is False, skipping this task 12372 1727204076.97203: _execute() done 12372 1727204076.97207: dumping result to json 12372 1727204076.97209: done dumping result, returning 12372 1727204076.97212: done running TaskExecutor() for managed-node3/TASK: Create EPEL 39 [12b410aa-8751-244a-02f9-0000000001ed] 12372 1727204076.97214: sending task result for task 12b410aa-8751-244a-02f9-0000000001ed 12372 1727204076.97464: done sending task result for task 12b410aa-8751-244a-02f9-0000000001ed 12372 1727204076.97467: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 12372 1727204076.97525: no more pending results, returning what we have 12372 1727204076.97528: results queue empty 12372 1727204076.97530: checking for any_errors_fatal 12372 1727204076.97531: done checking for any_errors_fatal 12372 1727204076.97532: checking for max_fail_percentage 12372 1727204076.97534: done checking for max_fail_percentage 12372 1727204076.97535: checking to see if all hosts have failed and the running result is not ok 12372 1727204076.97536: done checking to see if all hosts have failed 12372 1727204076.97537: getting the remaining hosts for this loop 12372 1727204076.97538: done getting the remaining hosts for this loop 12372 1727204076.97542: getting the next task for host managed-node3 12372 1727204076.97549: done getting next task for host managed-node3 12372 1727204076.97552: ^ task is: TASK: Install yum-utils package 12372 1727204076.97557: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204076.97560: getting variables 12372 1727204076.97562: in VariableManager get_vars() 12372 1727204076.97600: Calling all_inventory to load vars for managed-node3 12372 1727204076.97604: Calling groups_inventory to load vars for managed-node3 12372 1727204076.97609: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204076.97621: Calling all_plugins_play to load vars for managed-node3 12372 1727204076.97625: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204076.97629: Calling groups_plugins_play to load vars for managed-node3 12372 1727204076.97970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204076.98156: done with get_vars() 12372 1727204076.98163: done getting variables 12372 1727204076.98246: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 14:54:36 -0400 (0:00:00.024) 0:00:03.968 ***** 12372 1727204076.98271: entering _queue_task() for managed-node3/package 12372 1727204076.98272: Creating lock for package 12372 1727204076.98481: worker is 1 (out of 1 available) 12372 1727204076.98502: exiting _queue_task() for managed-node3/package 12372 1727204076.98519: done queuing things up, now waiting for results queue to drain 12372 1727204076.98523: waiting for pending results... 
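The next traced task goes through the generic package action. A minimal sketch consistent with the name and the guard evaluated in the lines that follow (the package spec and state are assumptions inferred from the task name):

- name: Install yum-utils package
  package:
    name: yum-utils
    state: present
  when: ansible_distribution in ['RedHat', 'CentOS']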
12372 1727204076.98858: running TaskExecutor() for managed-node3/TASK: Install yum-utils package 12372 1727204076.98929: in run() - task 12b410aa-8751-244a-02f9-0000000001ee 12372 1727204076.98945: variable 'ansible_search_path' from source: unknown 12372 1727204076.98948: variable 'ansible_search_path' from source: unknown 12372 1727204076.98974: calling self._execute() 12372 1727204076.99034: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204076.99043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204076.99057: variable 'omit' from source: magic vars 12372 1727204076.99554: variable 'ansible_distribution' from source: facts 12372 1727204076.99558: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 12372 1727204076.99562: when evaluation is False, skipping this task 12372 1727204076.99564: _execute() done 12372 1727204076.99566: dumping result to json 12372 1727204076.99569: done dumping result, returning 12372 1727204076.99571: done running TaskExecutor() for managed-node3/TASK: Install yum-utils package [12b410aa-8751-244a-02f9-0000000001ee] 12372 1727204076.99574: sending task result for task 12b410aa-8751-244a-02f9-0000000001ee skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 12372 1727204076.99923: no more pending results, returning what we have 12372 1727204076.99928: results queue empty 12372 1727204076.99929: checking for any_errors_fatal 12372 1727204076.99937: done checking for any_errors_fatal 12372 1727204076.99939: checking for max_fail_percentage 12372 1727204076.99940: done checking for max_fail_percentage 12372 1727204076.99943: checking to see if all hosts have failed and the running result is not ok 12372 1727204076.99944: done checking to see if all hosts have failed 12372 1727204076.99945: getting the remaining hosts for this loop 12372 1727204076.99946: done getting the remaining hosts for this loop 12372 1727204076.99954: getting the next task for host managed-node3 12372 1727204076.99960: done getting next task for host managed-node3 12372 1727204076.99962: ^ task is: TASK: Enable EPEL 7 12372 1727204076.99968: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204076.99974: getting variables 12372 1727204076.99976: in VariableManager get_vars() 12372 1727204077.00015: Calling all_inventory to load vars for managed-node3 12372 1727204077.00022: Calling groups_inventory to load vars for managed-node3 12372 1727204077.00028: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204077.00041: Calling all_plugins_play to load vars for managed-node3 12372 1727204077.00047: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204077.00054: Calling groups_plugins_play to load vars for managed-node3 12372 1727204077.00287: done sending task result for task 12b410aa-8751-244a-02f9-0000000001ee 12372 1727204077.00293: WORKER PROCESS EXITING 12372 1727204077.00322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204077.00639: done with get_vars() 12372 1727204077.00650: done getting variables 12372 1727204077.00747: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.025) 0:00:03.993 ***** 12372 1727204077.00796: entering _queue_task() for managed-node3/command 12372 1727204077.01195: worker is 1 (out of 1 available) 12372 1727204077.01289: exiting _queue_task() for managed-node3/command 12372 1727204077.01305: done queuing things up, now waiting for results queue to drain 12372 1727204077.01307: waiting for pending results... 
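"Enable EPEL 7" and the "Enable EPEL 8" task after it run through the command action, while "Enable EPEL 6" uses copy; all three are skipped on this host by the same distribution guard. A hedged sketch of the pattern follows. The commands and file contents are assumptions, and the real tasks probably also match on the distribution major version; such extra conditions would not show up in this trace because a when-list stops evaluating at the first item that is False:

- name: Enable EPEL 7
  command: yum-config-manager --enable epel     # hypothetical command
  when: ansible_distribution in ['RedHat', 'CentOS']

- name: Enable EPEL 6
  copy:
    dest: /etc/yum.repos.d/epel.repo            # hypothetical destination
    content: "..."                              # EPEL 6 repo definition, not shown in this log
  when: ansible_distribution in ['RedHat', 'CentOS']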
12372 1727204077.01509: running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 12372 1727204077.01767: in run() - task 12b410aa-8751-244a-02f9-0000000001ef 12372 1727204077.01788: variable 'ansible_search_path' from source: unknown 12372 1727204077.01800: variable 'ansible_search_path' from source: unknown 12372 1727204077.01936: calling self._execute() 12372 1727204077.02215: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204077.02231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204077.02395: variable 'omit' from source: magic vars 12372 1727204077.02737: variable 'ansible_distribution' from source: facts 12372 1727204077.02759: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 12372 1727204077.02768: when evaluation is False, skipping this task 12372 1727204077.02776: _execute() done 12372 1727204077.02784: dumping result to json 12372 1727204077.02796: done dumping result, returning 12372 1727204077.02809: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 [12b410aa-8751-244a-02f9-0000000001ef] 12372 1727204077.02829: sending task result for task 12b410aa-8751-244a-02f9-0000000001ef skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 12372 1727204077.03011: no more pending results, returning what we have 12372 1727204077.03015: results queue empty 12372 1727204077.03016: checking for any_errors_fatal 12372 1727204077.03025: done checking for any_errors_fatal 12372 1727204077.03026: checking for max_fail_percentage 12372 1727204077.03028: done checking for max_fail_percentage 12372 1727204077.03029: checking to see if all hosts have failed and the running result is not ok 12372 1727204077.03030: done checking to see if all hosts have failed 12372 1727204077.03031: getting the remaining hosts for this loop 12372 1727204077.03032: done getting the remaining hosts for this loop 12372 1727204077.03037: getting the next task for host managed-node3 12372 1727204077.03045: done getting next task for host managed-node3 12372 1727204077.03048: ^ task is: TASK: Enable EPEL 8 12372 1727204077.03054: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204077.03057: getting variables 12372 1727204077.03059: in VariableManager get_vars() 12372 1727204077.03100: Calling all_inventory to load vars for managed-node3 12372 1727204077.03105: Calling groups_inventory to load vars for managed-node3 12372 1727204077.03109: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204077.03205: Calling all_plugins_play to load vars for managed-node3 12372 1727204077.03210: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204077.03214: Calling groups_plugins_play to load vars for managed-node3 12372 1727204077.03609: done sending task result for task 12b410aa-8751-244a-02f9-0000000001ef 12372 1727204077.03612: WORKER PROCESS EXITING 12372 1727204077.03640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204077.03903: done with get_vars() 12372 1727204077.03911: done getting variables 12372 1727204077.03958: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.031) 0:00:04.025 ***** 12372 1727204077.03981: entering _queue_task() for managed-node3/command 12372 1727204077.04179: worker is 1 (out of 1 available) 12372 1727204077.04194: exiting _queue_task() for managed-node3/command 12372 1727204077.04207: done queuing things up, now waiting for results queue to drain 12372 1727204077.04210: waiting for pending results... 
12372 1727204077.04373: running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 12372 1727204077.04462: in run() - task 12b410aa-8751-244a-02f9-0000000001f0 12372 1727204077.04473: variable 'ansible_search_path' from source: unknown 12372 1727204077.04477: variable 'ansible_search_path' from source: unknown 12372 1727204077.04511: calling self._execute() 12372 1727204077.04575: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204077.04582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204077.04592: variable 'omit' from source: magic vars 12372 1727204077.05084: variable 'ansible_distribution' from source: facts 12372 1727204077.05297: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 12372 1727204077.05300: when evaluation is False, skipping this task 12372 1727204077.05303: _execute() done 12372 1727204077.05305: dumping result to json 12372 1727204077.05307: done dumping result, returning 12372 1727204077.05310: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 [12b410aa-8751-244a-02f9-0000000001f0] 12372 1727204077.05312: sending task result for task 12b410aa-8751-244a-02f9-0000000001f0 12372 1727204077.05379: done sending task result for task 12b410aa-8751-244a-02f9-0000000001f0 12372 1727204077.05383: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 12372 1727204077.05423: no more pending results, returning what we have 12372 1727204077.05426: results queue empty 12372 1727204077.05428: checking for any_errors_fatal 12372 1727204077.05431: done checking for any_errors_fatal 12372 1727204077.05432: checking for max_fail_percentage 12372 1727204077.05433: done checking for max_fail_percentage 12372 1727204077.05434: checking to see if all hosts have failed and the running result is not ok 12372 1727204077.05435: done checking to see if all hosts have failed 12372 1727204077.05436: getting the remaining hosts for this loop 12372 1727204077.05437: done getting the remaining hosts for this loop 12372 1727204077.05441: getting the next task for host managed-node3 12372 1727204077.05456: done getting next task for host managed-node3 12372 1727204077.05464: ^ task is: TASK: Enable EPEL 6 12372 1727204077.05468: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204077.05471: getting variables 12372 1727204077.05473: in VariableManager get_vars() 12372 1727204077.05500: Calling all_inventory to load vars for managed-node3 12372 1727204077.05502: Calling groups_inventory to load vars for managed-node3 12372 1727204077.05506: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204077.05514: Calling all_plugins_play to load vars for managed-node3 12372 1727204077.05517: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204077.05520: Calling groups_plugins_play to load vars for managed-node3 12372 1727204077.05759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204077.06061: done with get_vars() 12372 1727204077.06072: done getting variables 12372 1727204077.06143: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.021) 0:00:04.047 ***** 12372 1727204077.06178: entering _queue_task() for managed-node3/copy 12372 1727204077.06662: worker is 1 (out of 1 available) 12372 1727204077.06673: exiting _queue_task() for managed-node3/copy 12372 1727204077.06683: done queuing things up, now waiting for results queue to drain 12372 1727204077.06685: waiting for pending results... 12372 1727204077.06958: running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 12372 1727204077.07695: in run() - task 12b410aa-8751-244a-02f9-0000000001f2 12372 1727204077.07699: variable 'ansible_search_path' from source: unknown 12372 1727204077.07702: variable 'ansible_search_path' from source: unknown 12372 1727204077.07705: calling self._execute() 12372 1727204077.07707: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204077.07710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204077.07714: variable 'omit' from source: magic vars 12372 1727204077.08418: variable 'ansible_distribution' from source: facts 12372 1727204077.08439: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 12372 1727204077.08447: when evaluation is False, skipping this task 12372 1727204077.08455: _execute() done 12372 1727204077.08462: dumping result to json 12372 1727204077.08470: done dumping result, returning 12372 1727204077.08481: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 [12b410aa-8751-244a-02f9-0000000001f2] 12372 1727204077.08496: sending task result for task 12b410aa-8751-244a-02f9-0000000001f2 12372 1727204077.08620: done sending task result for task 12b410aa-8751-244a-02f9-0000000001f2 12372 1727204077.08630: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 12372 1727204077.08688: no more pending results, returning what we have 12372 1727204077.08694: results queue empty 12372 1727204077.08695: checking for any_errors_fatal 12372 1727204077.08702: done checking for any_errors_fatal 12372 1727204077.08703: checking for 
max_fail_percentage 12372 1727204077.08705: done checking for max_fail_percentage 12372 1727204077.08706: checking to see if all hosts have failed and the running result is not ok 12372 1727204077.08707: done checking to see if all hosts have failed 12372 1727204077.08707: getting the remaining hosts for this loop 12372 1727204077.08709: done getting the remaining hosts for this loop 12372 1727204077.08714: getting the next task for host managed-node3 12372 1727204077.08743: done getting next task for host managed-node3 12372 1727204077.08746: ^ task is: TASK: Set network provider to 'initscripts' 12372 1727204077.08750: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204077.08753: getting variables 12372 1727204077.08755: in VariableManager get_vars() 12372 1727204077.08782: Calling all_inventory to load vars for managed-node3 12372 1727204077.08785: Calling groups_inventory to load vars for managed-node3 12372 1727204077.08788: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204077.08800: Calling all_plugins_play to load vars for managed-node3 12372 1727204077.08803: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204077.08806: Calling groups_plugins_play to load vars for managed-node3 12372 1727204077.09062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204077.09343: done with get_vars() 12372 1727204077.09355: done getting variables 12372 1727204077.09456: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'initscripts'] *********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_initscripts.yml:12 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.033) 0:00:04.080 ***** 12372 1727204077.09486: entering _queue_task() for managed-node3/set_fact 12372 1727204077.09769: worker is 1 (out of 1 available) 12372 1727204077.09784: exiting _queue_task() for managed-node3/set_fact 12372 1727204077.10006: done queuing things up, now waiting for results queue to drain 12372 1727204077.10008: waiting for pending results... 
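This task comes from the test wrapper playbook itself (tests_bond_removal_initscripts.yml:12) rather than from el_repo_setup.yml, and both its action and its result are visible in the trace, so the sketch below involves no guesswork beyond YAML layout:

- name: Set network provider to 'initscripts'
  set_fact:
    network_provider: initscripts

The ok result a few lines below confirms it: ansible_facts carries network_provider "initscripts" and changed is false, which is the expected behaviour of set_fact.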
12372 1727204077.10120: running TaskExecutor() for managed-node3/TASK: Set network provider to 'initscripts' 12372 1727204077.10240: in run() - task 12b410aa-8751-244a-02f9-000000000007 12372 1727204077.10348: variable 'ansible_search_path' from source: unknown 12372 1727204077.10352: calling self._execute() 12372 1727204077.10401: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204077.10420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204077.10439: variable 'omit' from source: magic vars 12372 1727204077.10897: variable 'omit' from source: magic vars 12372 1727204077.10901: variable 'omit' from source: magic vars 12372 1727204077.11007: variable 'omit' from source: magic vars 12372 1727204077.11010: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12372 1727204077.11124: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12372 1727204077.11155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12372 1727204077.11182: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12372 1727204077.11212: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12372 1727204077.11260: variable 'inventory_hostname' from source: host vars for 'managed-node3' 12372 1727204077.11341: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204077.11351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204077.11587: Set connection var ansible_connection to ssh 12372 1727204077.11680: Set connection var ansible_timeout to 10 12372 1727204077.11696: Set connection var ansible_module_compression to ZIP_DEFLATED 12372 1727204077.11708: Set connection var ansible_shell_executable to /bin/sh 12372 1727204077.11719: Set connection var ansible_shell_type to sh 12372 1727204077.11736: Set connection var ansible_pipelining to False 12372 1727204077.11799: variable 'ansible_shell_executable' from source: unknown 12372 1727204077.11987: variable 'ansible_connection' from source: unknown 12372 1727204077.11990: variable 'ansible_module_compression' from source: unknown 12372 1727204077.11994: variable 'ansible_shell_type' from source: unknown 12372 1727204077.11996: variable 'ansible_shell_executable' from source: unknown 12372 1727204077.11999: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204077.12001: variable 'ansible_pipelining' from source: unknown 12372 1727204077.12003: variable 'ansible_timeout' from source: unknown 12372 1727204077.12006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204077.12208: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12372 1727204077.12231: variable 'omit' from source: magic vars 12372 1727204077.12241: starting attempt loop 12372 1727204077.12249: running the handler 12372 1727204077.12268: handler run complete 12372 1727204077.12285: attempt loop complete, returning result 12372 1727204077.12296: _execute() done 12372 
1727204077.12303: dumping result to json 12372 1727204077.12322: done dumping result, returning 12372 1727204077.12341: done running TaskExecutor() for managed-node3/TASK: Set network provider to 'initscripts' [12b410aa-8751-244a-02f9-000000000007] 12372 1727204077.12354: sending task result for task 12b410aa-8751-244a-02f9-000000000007 12372 1727204077.12597: done sending task result for task 12b410aa-8751-244a-02f9-000000000007 12372 1727204077.12600: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "network_provider": "initscripts" }, "changed": false } 12372 1727204077.12667: no more pending results, returning what we have 12372 1727204077.12671: results queue empty 12372 1727204077.12672: checking for any_errors_fatal 12372 1727204077.12679: done checking for any_errors_fatal 12372 1727204077.12680: checking for max_fail_percentage 12372 1727204077.12682: done checking for max_fail_percentage 12372 1727204077.12682: checking to see if all hosts have failed and the running result is not ok 12372 1727204077.12684: done checking to see if all hosts have failed 12372 1727204077.12685: getting the remaining hosts for this loop 12372 1727204077.12686: done getting the remaining hosts for this loop 12372 1727204077.12694: getting the next task for host managed-node3 12372 1727204077.12703: done getting next task for host managed-node3 12372 1727204077.12706: ^ task is: TASK: meta (flush_handlers) 12372 1727204077.12708: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204077.12713: getting variables 12372 1727204077.12718: in VariableManager get_vars() 12372 1727204077.12754: Calling all_inventory to load vars for managed-node3 12372 1727204077.12758: Calling groups_inventory to load vars for managed-node3 12372 1727204077.12763: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204077.12776: Calling all_plugins_play to load vars for managed-node3 12372 1727204077.12780: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204077.12785: Calling groups_plugins_play to load vars for managed-node3 12372 1727204077.13158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204077.13437: done with get_vars() 12372 1727204077.13451: done getting variables 12372 1727204077.13536: in VariableManager get_vars() 12372 1727204077.13548: Calling all_inventory to load vars for managed-node3 12372 1727204077.13551: Calling groups_inventory to load vars for managed-node3 12372 1727204077.13554: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204077.13559: Calling all_plugins_play to load vars for managed-node3 12372 1727204077.13562: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204077.13566: Calling groups_plugins_play to load vars for managed-node3 12372 1727204077.13784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204077.13950: done with get_vars() 12372 1727204077.13962: done queuing things up, now waiting for results queue to drain 12372 1727204077.13963: results queue empty 12372 1727204077.13964: checking for any_errors_fatal 12372 1727204077.13965: done checking for any_errors_fatal 12372 
1727204077.13966: checking for max_fail_percentage 12372 1727204077.13967: done checking for max_fail_percentage 12372 1727204077.13967: checking to see if all hosts have failed and the running result is not ok 12372 1727204077.13968: done checking to see if all hosts have failed 12372 1727204077.13968: getting the remaining hosts for this loop 12372 1727204077.13969: done getting the remaining hosts for this loop 12372 1727204077.13971: getting the next task for host managed-node3 12372 1727204077.13974: done getting next task for host managed-node3 12372 1727204077.13975: ^ task is: TASK: meta (flush_handlers) 12372 1727204077.13977: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204077.13983: getting variables 12372 1727204077.13984: in VariableManager get_vars() 12372 1727204077.13991: Calling all_inventory to load vars for managed-node3 12372 1727204077.13993: Calling groups_inventory to load vars for managed-node3 12372 1727204077.13995: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204077.13998: Calling all_plugins_play to load vars for managed-node3 12372 1727204077.14000: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204077.14002: Calling groups_plugins_play to load vars for managed-node3 12372 1727204077.14117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204077.14270: done with get_vars() 12372 1727204077.14276: done getting variables 12372 1727204077.14313: in VariableManager get_vars() 12372 1727204077.14322: Calling all_inventory to load vars for managed-node3 12372 1727204077.14324: Calling groups_inventory to load vars for managed-node3 12372 1727204077.14325: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204077.14329: Calling all_plugins_play to load vars for managed-node3 12372 1727204077.14331: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204077.14333: Calling groups_plugins_play to load vars for managed-node3 12372 1727204077.14462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204077.14614: done with get_vars() 12372 1727204077.14625: done queuing things up, now waiting for results queue to drain 12372 1727204077.14626: results queue empty 12372 1727204077.14627: checking for any_errors_fatal 12372 1727204077.14628: done checking for any_errors_fatal 12372 1727204077.14628: checking for max_fail_percentage 12372 1727204077.14629: done checking for max_fail_percentage 12372 1727204077.14630: checking to see if all hosts have failed and the running result is not ok 12372 1727204077.14630: done checking to see if all hosts have failed 12372 1727204077.14631: getting the remaining hosts for this loop 12372 1727204077.14632: done getting the remaining hosts for this loop 12372 1727204077.14633: getting the next task for host managed-node3 12372 1727204077.14635: done getting next task for host managed-node3 12372 1727204077.14636: ^ task is: None 12372 1727204077.14637: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204077.14638: done queuing things up, now waiting for results queue to drain 12372 1727204077.14639: results queue empty 12372 1727204077.14639: checking for any_errors_fatal 12372 1727204077.14640: done checking for any_errors_fatal 12372 1727204077.14640: checking for max_fail_percentage 12372 1727204077.14641: done checking for max_fail_percentage 12372 1727204077.14641: checking to see if all hosts have failed and the running result is not ok 12372 1727204077.14642: done checking to see if all hosts have failed 12372 1727204077.14643: getting the next task for host managed-node3 12372 1727204077.14645: done getting next task for host managed-node3 12372 1727204077.14646: ^ task is: None 12372 1727204077.14647: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204077.14692: in VariableManager get_vars() 12372 1727204077.14725: done with get_vars() 12372 1727204077.14730: in VariableManager get_vars() 12372 1727204077.14748: done with get_vars() 12372 1727204077.14751: variable 'omit' from source: magic vars 12372 1727204077.14776: in VariableManager get_vars() 12372 1727204077.14799: done with get_vars() 12372 1727204077.14820: variable 'omit' from source: magic vars PLAY [Play for testing bond removal] ******************************************* 12372 1727204077.15914: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 12372 1727204077.15962: getting the remaining hosts for this loop 12372 1727204077.15963: done getting the remaining hosts for this loop 12372 1727204077.15966: getting the next task for host managed-node3 12372 1727204077.15970: done getting next task for host managed-node3 12372 1727204077.15972: ^ task is: TASK: Gathering Facts 12372 1727204077.15973: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204077.15976: getting variables 12372 1727204077.15977: in VariableManager get_vars() 12372 1727204077.16004: Calling all_inventory to load vars for managed-node3 12372 1727204077.16007: Calling groups_inventory to load vars for managed-node3 12372 1727204077.16017: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204077.16023: Calling all_plugins_play to load vars for managed-node3 12372 1727204077.16039: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204077.16043: Calling groups_plugins_play to load vars for managed-node3 12372 1727204077.16244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204077.16531: done with get_vars() 12372 1727204077.16540: done getting variables 12372 1727204077.16599: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:3 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.071) 0:00:04.151 ***** 12372 1727204077.16627: entering _queue_task() for managed-node3/gather_facts 12372 1727204077.16872: worker is 1 (out of 1 available) 12372 1727204077.16885: exiting _queue_task() for managed-node3/gather_facts 12372 1727204077.16900: done queuing things up, now waiting for results queue to drain 12372 1727204077.16902: waiting for pending results... 
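The "Gathering Facts" task queued here is evaluated a few lines below against a distribution guard and skipped, since this host reports major version 39 and is neither CentOS nor RedHat. How that condition is attached in the real playbook (a task-level when, play keywords, or a condition inherited from an import) cannot be read from this log; the fragment below is only a hedged illustration of a guard with the same effect:

- name: Gathering Facts
  gather_facts:
  when: >-
    ansible_distribution in ['CentOS', 'RedHat']
    and ansible_distribution_major_version | int < 9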
12372 1727204077.17069: running TaskExecutor() for managed-node3/TASK: Gathering Facts 12372 1727204077.17152: in run() - task 12b410aa-8751-244a-02f9-000000000218 12372 1727204077.17165: variable 'ansible_search_path' from source: unknown 12372 1727204077.17198: calling self._execute() 12372 1727204077.17270: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204077.17278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204077.17288: variable 'omit' from source: magic vars 12372 1727204077.17672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204077.19796: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204077.19994: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204077.19999: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204077.20002: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204077.20005: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204077.20106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204077.20153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204077.20193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204077.20254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204077.20280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204077.20448: variable 'ansible_distribution' from source: facts 12372 1727204077.20460: variable 'ansible_distribution_major_version' from source: facts 12372 1727204077.20478: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204077.20488: when evaluation is False, skipping this task 12372 1727204077.20498: _execute() done 12372 1727204077.20506: dumping result to json 12372 1727204077.20518: done dumping result, returning 12372 1727204077.20532: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [12b410aa-8751-244a-02f9-000000000218] 12372 1727204077.20544: sending task result for task 12b410aa-8751-244a-02f9-000000000218 skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204077.20705: no more pending results, returning what we have 12372 1727204077.20709: results queue empty 12372 
1727204077.20710: checking for any_errors_fatal 12372 1727204077.20711: done checking for any_errors_fatal 12372 1727204077.20712: checking for max_fail_percentage 12372 1727204077.20714: done checking for max_fail_percentage 12372 1727204077.20714: checking to see if all hosts have failed and the running result is not ok 12372 1727204077.20715: done checking to see if all hosts have failed 12372 1727204077.20716: getting the remaining hosts for this loop 12372 1727204077.20718: done getting the remaining hosts for this loop 12372 1727204077.20722: getting the next task for host managed-node3 12372 1727204077.20729: done getting next task for host managed-node3 12372 1727204077.20732: ^ task is: TASK: meta (flush_handlers) 12372 1727204077.20734: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204077.20738: getting variables 12372 1727204077.20740: in VariableManager get_vars() 12372 1727204077.20802: Calling all_inventory to load vars for managed-node3 12372 1727204077.20806: Calling groups_inventory to load vars for managed-node3 12372 1727204077.20809: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204077.20820: Calling all_plugins_play to load vars for managed-node3 12372 1727204077.20823: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204077.20827: Calling groups_plugins_play to load vars for managed-node3 12372 1727204077.21165: done sending task result for task 12b410aa-8751-244a-02f9-000000000218 12372 1727204077.21168: WORKER PROCESS EXITING 12372 1727204077.21198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204077.21504: done with get_vars() 12372 1727204077.21520: done getting variables 12372 1727204077.21603: in VariableManager get_vars() 12372 1727204077.21630: Calling all_inventory to load vars for managed-node3 12372 1727204077.21633: Calling groups_inventory to load vars for managed-node3 12372 1727204077.21636: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204077.21642: Calling all_plugins_play to load vars for managed-node3 12372 1727204077.21646: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204077.21649: Calling groups_plugins_play to load vars for managed-node3 12372 1727204077.21842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204077.22125: done with get_vars() 12372 1727204077.22141: done queuing things up, now waiting for results queue to drain 12372 1727204077.22143: results queue empty 12372 1727204077.22144: checking for any_errors_fatal 12372 1727204077.22146: done checking for any_errors_fatal 12372 1727204077.22147: checking for max_fail_percentage 12372 1727204077.22148: done checking for max_fail_percentage 12372 1727204077.22149: checking to see if all hosts have failed and the running result is not ok 12372 1727204077.22150: done checking to see if all hosts have failed 12372 1727204077.22151: getting the remaining hosts for this loop 12372 1727204077.22152: done getting the remaining hosts for this loop 12372 1727204077.22155: getting the next task for host managed-node3 12372 1727204077.22159: done getting next task for host managed-node3 12372 
1727204077.22161: ^ task is: TASK: INIT Prepare setup 12372 1727204077.22163: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204077.22166: getting variables 12372 1727204077.22167: in VariableManager get_vars() 12372 1727204077.22191: Calling all_inventory to load vars for managed-node3 12372 1727204077.22194: Calling groups_inventory to load vars for managed-node3 12372 1727204077.22196: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204077.22202: Calling all_plugins_play to load vars for managed-node3 12372 1727204077.22210: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204077.22214: Calling groups_plugins_play to load vars for managed-node3 12372 1727204077.22408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204077.22724: done with get_vars() 12372 1727204077.22734: done getting variables 12372 1727204077.22827: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [INIT Prepare setup] ****************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:15 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.062) 0:00:04.214 ***** 12372 1727204077.22859: entering _queue_task() for managed-node3/debug 12372 1727204077.22861: Creating lock for debug 12372 1727204077.23184: worker is 1 (out of 1 available) 12372 1727204077.23400: exiting _queue_task() for managed-node3/debug 12372 1727204077.23410: done queuing things up, now waiting for results queue to drain 12372 1727204077.23412: waiting for pending results... 
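
The entries above cover the INIT Prepare setup task: the debug action plugin is loaded, the task is queued for managed-node3, and, as the following entries show, it is skipped because the distribution conditional is false. As a rough orientation only, a task producing this pattern would look something like the sketch below; the message text is invented, the condition is copied from the false_condition reported in the log, and in the real test it may well be applied at the play or block level rather than on the task itself.

```yaml
# Hypothetical sketch, not the actual contents of tests_bond_removal.yml:15.
- name: INIT Prepare setup
  debug:
    msg: "Preparing test setup"   # invented message
  when: ansible_distribution in ['CentOS', 'RedHat'] and ansible_distribution_major_version | int < 9
```
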
12372 1727204077.23487: running TaskExecutor() for managed-node3/TASK: INIT Prepare setup 12372 1727204077.23614: in run() - task 12b410aa-8751-244a-02f9-00000000000b 12372 1727204077.23641: variable 'ansible_search_path' from source: unknown 12372 1727204077.23683: calling self._execute() 12372 1727204077.23786: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204077.23802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204077.23821: variable 'omit' from source: magic vars 12372 1727204077.24373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204077.27104: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204077.27210: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204077.27264: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204077.27314: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204077.27359: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204077.27462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204077.27505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204077.27549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204077.27612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204077.27639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204077.27811: variable 'ansible_distribution' from source: facts 12372 1727204077.27828: variable 'ansible_distribution_major_version' from source: facts 12372 1727204077.27847: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204077.27856: when evaluation is False, skipping this task 12372 1727204077.27865: _execute() done 12372 1727204077.27874: dumping result to json 12372 1727204077.27892: done dumping result, returning 12372 1727204077.27906: done running TaskExecutor() for managed-node3/TASK: INIT Prepare setup [12b410aa-8751-244a-02f9-00000000000b] 12372 1727204077.27994: sending task result for task 12b410aa-8751-244a-02f9-00000000000b 12372 1727204077.28071: done sending task result for task 12b410aa-8751-244a-02f9-00000000000b 12372 1727204077.28075: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12372 1727204077.28159: no more pending 
results, returning what we have 12372 1727204077.28163: results queue empty 12372 1727204077.28165: checking for any_errors_fatal 12372 1727204077.28167: done checking for any_errors_fatal 12372 1727204077.28168: checking for max_fail_percentage 12372 1727204077.28170: done checking for max_fail_percentage 12372 1727204077.28171: checking to see if all hosts have failed and the running result is not ok 12372 1727204077.28173: done checking to see if all hosts have failed 12372 1727204077.28174: getting the remaining hosts for this loop 12372 1727204077.28175: done getting the remaining hosts for this loop 12372 1727204077.28181: getting the next task for host managed-node3 12372 1727204077.28192: done getting next task for host managed-node3 12372 1727204077.28196: ^ task is: TASK: Install dnsmasq 12372 1727204077.28200: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204077.28204: getting variables 12372 1727204077.28207: in VariableManager get_vars() 12372 1727204077.28271: Calling all_inventory to load vars for managed-node3 12372 1727204077.28275: Calling groups_inventory to load vars for managed-node3 12372 1727204077.28278: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204077.28496: Calling all_plugins_play to load vars for managed-node3 12372 1727204077.28501: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204077.28505: Calling groups_plugins_play to load vars for managed-node3 12372 1727204077.28752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204077.29044: done with get_vars() 12372 1727204077.29056: done getting variables 12372 1727204077.29124: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.063) 0:00:04.277 ***** 12372 1727204077.29163: entering _queue_task() for managed-node3/package 12372 1727204077.29426: worker is 1 (out of 1 available) 12372 1727204077.29438: exiting _queue_task() for managed-node3/package 12372 1727204077.29449: done queuing things up, now waiting for results queue to drain 12372 1727204077.29451: waiting for pending results... 
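
The Install dnsmasq task resolves to the generic package action plugin, as the Loading ActionModule 'package' entry above shows, and it is skipped by the same distribution conditional. A minimal sketch of a task with that shape follows; only the task name and the use of the package module come from the log, everything else is assumed rather than taken from create_test_interfaces_with_dhcp.yml:3.

```yaml
# Hypothetical sketch of a package-module task matching the banner above.
- name: Install dnsmasq
  package:
    name: dnsmasq
    state: present
```
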
12372 1727204077.29808: running TaskExecutor() for managed-node3/TASK: Install dnsmasq 12372 1727204077.29867: in run() - task 12b410aa-8751-244a-02f9-00000000000f 12372 1727204077.29887: variable 'ansible_search_path' from source: unknown 12372 1727204077.29900: variable 'ansible_search_path' from source: unknown 12372 1727204077.29951: calling self._execute() 12372 1727204077.30051: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204077.30065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204077.30081: variable 'omit' from source: magic vars 12372 1727204077.30683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204077.34033: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204077.34229: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204077.34336: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204077.34483: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204077.34487: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204077.34603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204077.34648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204077.34706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204077.34767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204077.34908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204077.34968: variable 'ansible_distribution' from source: facts 12372 1727204077.34982: variable 'ansible_distribution_major_version' from source: facts 12372 1727204077.35002: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204077.35013: when evaluation is False, skipping this task 12372 1727204077.35026: _execute() done 12372 1727204077.35034: dumping result to json 12372 1727204077.35043: done dumping result, returning 12372 1727204077.35053: done running TaskExecutor() for managed-node3/TASK: Install dnsmasq [12b410aa-8751-244a-02f9-00000000000f] 12372 1727204077.35063: sending task result for task 12b410aa-8751-244a-02f9-00000000000f skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204077.35508: no more pending results, 
returning what we have 12372 1727204077.35511: results queue empty 12372 1727204077.35512: checking for any_errors_fatal 12372 1727204077.35521: done checking for any_errors_fatal 12372 1727204077.35522: checking for max_fail_percentage 12372 1727204077.35525: done checking for max_fail_percentage 12372 1727204077.35526: checking to see if all hosts have failed and the running result is not ok 12372 1727204077.35527: done checking to see if all hosts have failed 12372 1727204077.35528: getting the remaining hosts for this loop 12372 1727204077.35529: done getting the remaining hosts for this loop 12372 1727204077.35533: getting the next task for host managed-node3 12372 1727204077.35539: done getting next task for host managed-node3 12372 1727204077.35542: ^ task is: TASK: Install pgrep, sysctl 12372 1727204077.35545: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204077.35550: getting variables 12372 1727204077.35551: in VariableManager get_vars() 12372 1727204077.35607: Calling all_inventory to load vars for managed-node3 12372 1727204077.35611: Calling groups_inventory to load vars for managed-node3 12372 1727204077.35613: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204077.35626: Calling all_plugins_play to load vars for managed-node3 12372 1727204077.35630: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204077.35634: Calling groups_plugins_play to load vars for managed-node3 12372 1727204077.35894: done sending task result for task 12b410aa-8751-244a-02f9-00000000000f 12372 1727204077.35898: WORKER PROCESS EXITING 12372 1727204077.35929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204077.36231: done with get_vars() 12372 1727204077.36244: done getting variables 12372 1727204077.36319: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.071) 0:00:04.349 ***** 12372 1727204077.36357: entering _queue_task() for managed-node3/package 12372 1727204077.36861: worker is 1 (out of 1 available) 12372 1727204077.36873: exiting _queue_task() for managed-node3/package 12372 1727204077.36887: done queuing things up, now waiting for results queue to drain 12372 1727204077.37192: waiting for pending results... 
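
Every skip in this run reports the same false_condition built from ansible_distribution and ansible_distribution_major_version, both of which come from gathered facts. To confirm what those facts resolve to on the managed node, debug tasks like the illustrative pair below would print them and the resulting truth value; the fact names are exactly the ones the log references, but the tasks themselves are not part of the test.

```yaml
# Illustrative only: print the facts the skip condition depends on.
- name: Show distribution facts used by the skip condition
  debug:
    msg: "{{ ansible_distribution }} {{ ansible_distribution_major_version }}"

- name: Show how the skip condition evaluates on this host
  debug:
    msg: "{{ ansible_distribution in ['CentOS', 'RedHat'] and ansible_distribution_major_version | int < 9 }}"
```
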
12372 1727204077.37361: running TaskExecutor() for managed-node3/TASK: Install pgrep, sysctl 12372 1727204077.37668: in run() - task 12b410aa-8751-244a-02f9-000000000010 12372 1727204077.37761: variable 'ansible_search_path' from source: unknown 12372 1727204077.37769: variable 'ansible_search_path' from source: unknown 12372 1727204077.37814: calling self._execute() 12372 1727204077.37996: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204077.38218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204077.38222: variable 'omit' from source: magic vars 12372 1727204077.38827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204077.41718: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204077.41805: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204077.41856: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204077.41904: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204077.41946: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204077.42050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204077.42092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204077.42132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204077.42196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204077.42251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204077.42396: variable 'ansible_distribution' from source: facts 12372 1727204077.42408: variable 'ansible_distribution_major_version' from source: facts 12372 1727204077.42427: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204077.42435: when evaluation is False, skipping this task 12372 1727204077.42442: _execute() done 12372 1727204077.42468: dumping result to json 12372 1727204077.42471: done dumping result, returning 12372 1727204077.42474: done running TaskExecutor() for managed-node3/TASK: Install pgrep, sysctl [12b410aa-8751-244a-02f9-000000000010] 12372 1727204077.42482: sending task result for task 12b410aa-8751-244a-02f9-000000000010 12372 1727204077.42918: done sending task result for task 12b410aa-8751-244a-02f9-000000000010 12372 1727204077.42921: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204077.42963: no more pending results, returning what we have 12372 1727204077.42967: results queue empty 12372 1727204077.42968: checking for any_errors_fatal 12372 1727204077.42972: done checking for any_errors_fatal 12372 1727204077.42973: checking for max_fail_percentage 12372 1727204077.42975: done checking for max_fail_percentage 12372 1727204077.42976: checking to see if all hosts have failed and the running result is not ok 12372 1727204077.42978: done checking to see if all hosts have failed 12372 1727204077.42979: getting the remaining hosts for this loop 12372 1727204077.42980: done getting the remaining hosts for this loop 12372 1727204077.42984: getting the next task for host managed-node3 12372 1727204077.42991: done getting next task for host managed-node3 12372 1727204077.42994: ^ task is: TASK: Install pgrep, sysctl 12372 1727204077.42997: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204077.43001: getting variables 12372 1727204077.43002: in VariableManager get_vars() 12372 1727204077.43055: Calling all_inventory to load vars for managed-node3 12372 1727204077.43058: Calling groups_inventory to load vars for managed-node3 12372 1727204077.43061: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204077.43071: Calling all_plugins_play to load vars for managed-node3 12372 1727204077.43075: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204077.43078: Calling groups_plugins_play to load vars for managed-node3 12372 1727204077.43664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204077.44414: done with get_vars() 12372 1727204077.44430: done getting variables 12372 1727204077.44704: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.083) 0:00:04.433 ***** 12372 1727204077.44742: entering _queue_task() for managed-node3/package 12372 1727204077.45222: worker is 1 (out of 1 available) 12372 1727204077.45234: exiting _queue_task() for managed-node3/package 12372 1727204077.45244: done queuing things up, now waiting for results queue to drain 12372 1727204077.45246: waiting for pending results... 
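
Note that the task name Install pgrep, sysctl appears twice with different task paths (create_test_interfaces_with_dhcp.yml:17 and :26) and different task IDs, so these are two distinct tasks that happen to share a name. When a later step needs to know whether such a guarded task actually ran, the usual pattern is to register the result and test it with the skipped test; the sketch below is hypothetical and the package name is assumed, since the real argument list is not shown in this log.

```yaml
# Hypothetical: register the result of a guarded install and test for a skip.
- name: Install pgrep, sysctl
  package:
    name: procps-ng          # assumed package; the real list is not in the log
    state: present
  register: procps_install
  when: ansible_distribution in ['CentOS', 'RedHat'] and ansible_distribution_major_version | int < 9

- name: Report whether the install was skipped
  debug:
    msg: "Install was skipped: {{ procps_install is skipped }}"
```
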
12372 1727204077.45466: running TaskExecutor() for managed-node3/TASK: Install pgrep, sysctl 12372 1727204077.45997: in run() - task 12b410aa-8751-244a-02f9-000000000011 12372 1727204077.46001: variable 'ansible_search_path' from source: unknown 12372 1727204077.46004: variable 'ansible_search_path' from source: unknown 12372 1727204077.46007: calling self._execute() 12372 1727204077.46025: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204077.46039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204077.46057: variable 'omit' from source: magic vars 12372 1727204077.47281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204077.50869: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204077.51002: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204077.51070: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204077.51121: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204077.51169: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204077.51271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204077.51316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204077.51353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204077.51414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204077.51438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204077.51612: variable 'ansible_distribution' from source: facts 12372 1727204077.51626: variable 'ansible_distribution_major_version' from source: facts 12372 1727204077.51645: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204077.51654: when evaluation is False, skipping this task 12372 1727204077.51661: _execute() done 12372 1727204077.51669: dumping result to json 12372 1727204077.51678: done dumping result, returning 12372 1727204077.51695: done running TaskExecutor() for managed-node3/TASK: Install pgrep, sysctl [12b410aa-8751-244a-02f9-000000000011] 12372 1727204077.51707: sending task result for task 12b410aa-8751-244a-02f9-000000000011 12372 1727204077.51907: done sending task result for task 12b410aa-8751-244a-02f9-000000000011 12372 1727204077.51911: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204077.51962: no more pending results, returning what we have 12372 1727204077.51966: results queue empty 12372 1727204077.51967: checking for any_errors_fatal 12372 1727204077.51975: done checking for any_errors_fatal 12372 1727204077.51976: checking for max_fail_percentage 12372 1727204077.51977: done checking for max_fail_percentage 12372 1727204077.51978: checking to see if all hosts have failed and the running result is not ok 12372 1727204077.51980: done checking to see if all hosts have failed 12372 1727204077.51980: getting the remaining hosts for this loop 12372 1727204077.51982: done getting the remaining hosts for this loop 12372 1727204077.51986: getting the next task for host managed-node3 12372 1727204077.51995: done getting next task for host managed-node3 12372 1727204077.51998: ^ task is: TASK: Create test interfaces 12372 1727204077.52001: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204077.52006: getting variables 12372 1727204077.52008: in VariableManager get_vars() 12372 1727204077.52069: Calling all_inventory to load vars for managed-node3 12372 1727204077.52073: Calling groups_inventory to load vars for managed-node3 12372 1727204077.52075: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204077.52087: Calling all_plugins_play to load vars for managed-node3 12372 1727204077.52295: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204077.52301: Calling groups_plugins_play to load vars for managed-node3 12372 1727204077.52595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204077.52875: done with get_vars() 12372 1727204077.52892: done getting variables 12372 1727204077.52999: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.082) 0:00:04.516 ***** 12372 1727204077.53034: entering _queue_task() for managed-node3/shell 12372 1727204077.53036: Creating lock for shell 12372 1727204077.53411: worker is 1 (out of 1 available) 12372 1727204077.53423: exiting _queue_task() for managed-node3/shell 12372 1727204077.53436: done queuing things up, now waiting for results queue to drain 12372 1727204077.53438: waiting for pending results... 
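
The Create test interfaces task loads the shell action plugin (create_test_interfaces_with_dhcp.yml:35), but because it is skipped its command text never appears in this log. The commands in the sketch below are therefore entirely invented; they only illustrate the general shape of a shell task that could create the test1/test2 interfaces the later assert tasks refer to.

```yaml
# Entirely hypothetical commands; only the task name, the shell action,
# and the interface names test1/test2 appear in the log.
- name: Create test interfaces
  shell: |
    ip link add test1 type veth peer name test1p
    ip link add test2 type veth peer name test2p
```
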
12372 1727204077.53819: running TaskExecutor() for managed-node3/TASK: Create test interfaces 12372 1727204077.53953: in run() - task 12b410aa-8751-244a-02f9-000000000012 12372 1727204077.54008: variable 'ansible_search_path' from source: unknown 12372 1727204077.54037: variable 'ansible_search_path' from source: unknown 12372 1727204077.54132: calling self._execute() 12372 1727204077.54249: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204077.54254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204077.54272: variable 'omit' from source: magic vars 12372 1727204077.54975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204077.58513: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204077.58602: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204077.58653: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204077.58701: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204077.58740: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204077.58848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204077.58880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204077.58956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204077.58978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204077.59003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204077.59170: variable 'ansible_distribution' from source: facts 12372 1727204077.59185: variable 'ansible_distribution_major_version' from source: facts 12372 1727204077.59204: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204077.59212: when evaluation is False, skipping this task 12372 1727204077.59287: _execute() done 12372 1727204077.59290: dumping result to json 12372 1727204077.59294: done dumping result, returning 12372 1727204077.59297: done running TaskExecutor() for managed-node3/TASK: Create test interfaces [12b410aa-8751-244a-02f9-000000000012] 12372 1727204077.59299: sending task result for task 12b410aa-8751-244a-02f9-000000000012 12372 1727204077.59370: done sending task result for task 12b410aa-8751-244a-02f9-000000000012 12372 1727204077.59373: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204077.59441: no more pending results, returning what we have 12372 1727204077.59445: results queue empty 12372 1727204077.59446: checking for any_errors_fatal 12372 1727204077.59453: done checking for any_errors_fatal 12372 1727204077.59454: checking for max_fail_percentage 12372 1727204077.59456: done checking for max_fail_percentage 12372 1727204077.59457: checking to see if all hosts have failed and the running result is not ok 12372 1727204077.59458: done checking to see if all hosts have failed 12372 1727204077.59459: getting the remaining hosts for this loop 12372 1727204077.59461: done getting the remaining hosts for this loop 12372 1727204077.59465: getting the next task for host managed-node3 12372 1727204077.59477: done getting next task for host managed-node3 12372 1727204077.59481: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12372 1727204077.59484: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204077.59488: getting variables 12372 1727204077.59492: in VariableManager get_vars() 12372 1727204077.59551: Calling all_inventory to load vars for managed-node3 12372 1727204077.59555: Calling groups_inventory to load vars for managed-node3 12372 1727204077.59558: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204077.59570: Calling all_plugins_play to load vars for managed-node3 12372 1727204077.59573: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204077.59577: Calling groups_plugins_play to load vars for managed-node3 12372 1727204077.60051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204077.60359: done with get_vars() 12372 1727204077.60371: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.074) 0:00:04.590 ***** 12372 1727204077.60479: entering _queue_task() for managed-node3/include_tasks 12372 1727204077.60725: worker is 1 (out of 1 available) 12372 1727204077.60739: exiting _queue_task() for managed-node3/include_tasks 12372 1727204077.60752: done queuing things up, now waiting for results queue to drain 12372 1727204077.60754: waiting for pending results... 
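
The include at assert_device_present.yml:3 pulls in get_interface_stat.yml before the interface assertion runs. That file's contents are not shown here, so the sketch below is a guess at what such a helper typically does: stat the interface's sysfs node and register the result under a name the assertion can check. Both the path and the registered variable name are assumptions.

```yaml
# Hypothetical content for get_interface_stat.yml; path and variable name assumed.
- name: "Get stat for the interface '{{ interface }}'"
  stat:
    path: "/sys/class/net/{{ interface }}"
  register: interface_stat
```
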
12372 1727204077.61112: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 12372 1727204077.61200: in run() - task 12b410aa-8751-244a-02f9-000000000016 12372 1727204077.61307: variable 'ansible_search_path' from source: unknown 12372 1727204077.61311: variable 'ansible_search_path' from source: unknown 12372 1727204077.61317: calling self._execute() 12372 1727204077.61626: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204077.61679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204077.61747: variable 'omit' from source: magic vars 12372 1727204077.63056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204077.66963: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204077.67073: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204077.67125: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204077.67182: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204077.67261: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204077.67353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204077.67405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204077.67444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204077.67585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204077.67591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204077.67721: variable 'ansible_distribution' from source: facts 12372 1727204077.67733: variable 'ansible_distribution_major_version' from source: facts 12372 1727204077.67749: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204077.67757: when evaluation is False, skipping this task 12372 1727204077.67765: _execute() done 12372 1727204077.67772: dumping result to json 12372 1727204077.67779: done dumping result, returning 12372 1727204077.67794: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-244a-02f9-000000000016] 12372 1727204077.67834: sending task result for task 12b410aa-8751-244a-02f9-000000000016 12372 1727204077.67997: done sending task result for task 12b410aa-8751-244a-02f9-000000000016 12372 1727204077.68003: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": 
"(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204077.68060: no more pending results, returning what we have 12372 1727204077.68064: results queue empty 12372 1727204077.68065: checking for any_errors_fatal 12372 1727204077.68070: done checking for any_errors_fatal 12372 1727204077.68071: checking for max_fail_percentage 12372 1727204077.68073: done checking for max_fail_percentage 12372 1727204077.68074: checking to see if all hosts have failed and the running result is not ok 12372 1727204077.68075: done checking to see if all hosts have failed 12372 1727204077.68076: getting the remaining hosts for this loop 12372 1727204077.68077: done getting the remaining hosts for this loop 12372 1727204077.68084: getting the next task for host managed-node3 12372 1727204077.68096: done getting next task for host managed-node3 12372 1727204077.68099: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 12372 1727204077.68102: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204077.68106: getting variables 12372 1727204077.68108: in VariableManager get_vars() 12372 1727204077.68167: Calling all_inventory to load vars for managed-node3 12372 1727204077.68170: Calling groups_inventory to load vars for managed-node3 12372 1727204077.68175: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204077.68186: Calling all_plugins_play to load vars for managed-node3 12372 1727204077.68346: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204077.68380: Calling groups_plugins_play to load vars for managed-node3 12372 1727204077.68663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204077.69041: done with get_vars() 12372 1727204077.69055: done getting variables 12372 1727204077.69231: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 12372 1727204077.69470: variable 'interface' from source: task vars 12372 1727204077.69476: variable 'dhcp_interface1' from source: play vars 12372 1727204077.69591: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.091) 0:00:04.682 ***** 12372 1727204077.69654: entering _queue_task() for managed-node3/assert 12372 1727204077.69657: Creating lock for assert 12372 1727204077.70427: worker is 1 (out of 1 available) 12372 1727204077.70436: exiting _queue_task() for managed-node3/assert 12372 
1727204077.70445: done queuing things up, now waiting for results queue to drain 12372 1727204077.70447: waiting for pending results... 12372 1727204077.70899: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'test1' 12372 1727204077.70909: in run() - task 12b410aa-8751-244a-02f9-000000000017 12372 1727204077.70912: variable 'ansible_search_path' from source: unknown 12372 1727204077.70917: variable 'ansible_search_path' from source: unknown 12372 1727204077.71126: calling self._execute() 12372 1727204077.71313: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204077.71332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204077.71362: variable 'omit' from source: magic vars 12372 1727204077.72285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204077.76730: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204077.76827: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204077.76879: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204077.76947: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204077.76986: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204077.77098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204077.77149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204077.77194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204077.77294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204077.77299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204077.77462: variable 'ansible_distribution' from source: facts 12372 1727204077.77475: variable 'ansible_distribution_major_version' from source: facts 12372 1727204077.77496: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204077.77505: when evaluation is False, skipping this task 12372 1727204077.77559: _execute() done 12372 1727204077.77563: dumping result to json 12372 1727204077.77565: done dumping result, returning 12372 1727204077.77568: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'test1' [12b410aa-8751-244a-02f9-000000000017] 12372 1727204077.77570: sending task result for task 12b410aa-8751-244a-02f9-000000000017 skipping: [managed-node3] => { "changed": false, "false_condition": 
"(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204077.77830: no more pending results, returning what we have 12372 1727204077.77834: results queue empty 12372 1727204077.77836: checking for any_errors_fatal 12372 1727204077.77841: done checking for any_errors_fatal 12372 1727204077.77842: checking for max_fail_percentage 12372 1727204077.77844: done checking for max_fail_percentage 12372 1727204077.77845: checking to see if all hosts have failed and the running result is not ok 12372 1727204077.77846: done checking to see if all hosts have failed 12372 1727204077.77848: getting the remaining hosts for this loop 12372 1727204077.77849: done getting the remaining hosts for this loop 12372 1727204077.77854: getting the next task for host managed-node3 12372 1727204077.77865: done getting next task for host managed-node3 12372 1727204077.77868: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12372 1727204077.77872: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204077.77876: getting variables 12372 1727204077.77878: in VariableManager get_vars() 12372 1727204077.77944: Calling all_inventory to load vars for managed-node3 12372 1727204077.77948: Calling groups_inventory to load vars for managed-node3 12372 1727204077.77951: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204077.77963: Calling all_plugins_play to load vars for managed-node3 12372 1727204077.77966: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204077.77970: Calling groups_plugins_play to load vars for managed-node3 12372 1727204077.78491: done sending task result for task 12b410aa-8751-244a-02f9-000000000017 12372 1727204077.78495: WORKER PROCESS EXITING 12372 1727204077.78525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204077.79174: done with get_vars() 12372 1727204077.79186: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.098) 0:00:04.780 ***** 12372 1727204077.79500: entering _queue_task() for managed-node3/include_tasks 12372 1727204077.80191: worker is 1 (out of 1 available) 12372 1727204077.80200: exiting _queue_task() for managed-node3/include_tasks 12372 1727204077.80211: done queuing things up, now waiting for results queue to drain 12372 1727204077.80213: waiting for pending results... 
12372 1727204077.80524: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 12372 1727204077.80754: in run() - task 12b410aa-8751-244a-02f9-00000000001b 12372 1727204077.80775: variable 'ansible_search_path' from source: unknown 12372 1727204077.80785: variable 'ansible_search_path' from source: unknown 12372 1727204077.80895: calling self._execute() 12372 1727204077.81110: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204077.81125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204077.81142: variable 'omit' from source: magic vars 12372 1727204077.82314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204077.87697: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204077.87752: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204077.87845: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204077.88053: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204077.88057: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204077.88274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204077.88319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204077.88358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204077.88597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204077.88601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204077.89074: variable 'ansible_distribution' from source: facts 12372 1727204077.89088: variable 'ansible_distribution_major_version' from source: facts 12372 1727204077.89112: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204077.89149: when evaluation is False, skipping this task 12372 1727204077.89160: _execute() done 12372 1727204077.89170: dumping result to json 12372 1727204077.89180: done dumping result, returning 12372 1727204077.89261: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-244a-02f9-00000000001b] 12372 1727204077.89273: sending task result for task 12b410aa-8751-244a-02f9-00000000001b skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 
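
The variable-source entries ('interface' from task vars, with dhcp_interface1 above and dhcp_interface2 in the entries that follow, both from play vars) imply a layering in which the play defines the two DHCP interface names and each inclusion of assert_device_present.yml passes one of them in as interface. The play sketch below shows that wiring; the host name and the values test1/test2 come from the log, while the structure itself is assumed.

```yaml
# Hypothetical wiring inferred from the variable sources reported in the log.
- hosts: managed-node3
  vars:
    dhcp_interface1: test1
    dhcp_interface2: test2
  tasks:
    - name: Check the first DHCP interface
      include_tasks: tasks/assert_device_present.yml
      vars:
        interface: "{{ dhcp_interface1 }}"

    - name: Check the second DHCP interface
      include_tasks: tasks/assert_device_present.yml
      vars:
        interface: "{{ dhcp_interface2 }}"
```
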
12372 1727204077.89449: no more pending results, returning what we have 12372 1727204077.89453: results queue empty 12372 1727204077.89454: checking for any_errors_fatal 12372 1727204077.89461: done checking for any_errors_fatal 12372 1727204077.89463: checking for max_fail_percentage 12372 1727204077.89464: done checking for max_fail_percentage 12372 1727204077.89465: checking to see if all hosts have failed and the running result is not ok 12372 1727204077.89466: done checking to see if all hosts have failed 12372 1727204077.89467: getting the remaining hosts for this loop 12372 1727204077.89469: done getting the remaining hosts for this loop 12372 1727204077.89474: getting the next task for host managed-node3 12372 1727204077.89482: done getting next task for host managed-node3 12372 1727204077.89486: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 12372 1727204077.89492: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204077.89496: getting variables 12372 1727204077.89498: in VariableManager get_vars() 12372 1727204077.89564: Calling all_inventory to load vars for managed-node3 12372 1727204077.89568: Calling groups_inventory to load vars for managed-node3 12372 1727204077.89571: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204077.89583: Calling all_plugins_play to load vars for managed-node3 12372 1727204077.89586: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204077.89899: done sending task result for task 12b410aa-8751-244a-02f9-00000000001b 12372 1727204077.89903: WORKER PROCESS EXITING 12372 1727204077.89909: Calling groups_plugins_play to load vars for managed-node3 12372 1727204077.90532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204077.91651: done with get_vars() 12372 1727204077.91666: done getting variables 12372 1727204077.91740: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12372 1727204077.92484: variable 'interface' from source: task vars 12372 1727204077.92488: variable 'dhcp_interface2' from source: play vars 12372 1727204077.92577: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.131) 0:00:04.911 ***** 12372 1727204077.92620: entering _queue_task() for managed-node3/assert 12372 1727204077.93856: worker is 1 (out of 1 available) 12372 1727204077.93872: exiting _queue_task() for managed-node3/assert 12372 1727204077.93886: done queuing things up, now waiting for results 
queue to drain 12372 1727204077.93888: waiting for pending results... 12372 1727204077.94694: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'test2' 12372 1727204077.95074: in run() - task 12b410aa-8751-244a-02f9-00000000001c 12372 1727204077.95295: variable 'ansible_search_path' from source: unknown 12372 1727204077.95299: variable 'ansible_search_path' from source: unknown 12372 1727204077.95308: calling self._execute() 12372 1727204077.95636: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204077.95642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204077.95645: variable 'omit' from source: magic vars 12372 1727204077.97053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204078.04323: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204078.04665: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204078.04670: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204078.04793: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204078.04831: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204078.05051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204078.05295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204078.05299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204078.05302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204078.05495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204078.05799: variable 'ansible_distribution' from source: facts 12372 1727204078.05812: variable 'ansible_distribution_major_version' from source: facts 12372 1727204078.05829: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204078.05837: when evaluation is False, skipping this task 12372 1727204078.05847: _execute() done 12372 1727204078.05865: dumping result to json 12372 1727204078.05875: done dumping result, returning 12372 1727204078.05891: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'test2' [12b410aa-8751-244a-02f9-00000000001c] 12372 1727204078.05974: sending task result for task 12b410aa-8751-244a-02f9-00000000001c 12372 1727204078.06100: done sending task result for task 12b410aa-8751-244a-02f9-00000000001c 12372 1727204078.06109: WORKER PROCESS 
EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204078.06170: no more pending results, returning what we have 12372 1727204078.06175: results queue empty 12372 1727204078.06176: checking for any_errors_fatal 12372 1727204078.06185: done checking for any_errors_fatal 12372 1727204078.06186: checking for max_fail_percentage 12372 1727204078.06188: done checking for max_fail_percentage 12372 1727204078.06190: checking to see if all hosts have failed and the running result is not ok 12372 1727204078.06191: done checking to see if all hosts have failed 12372 1727204078.06192: getting the remaining hosts for this loop 12372 1727204078.06195: done getting the remaining hosts for this loop 12372 1727204078.06201: getting the next task for host managed-node3 12372 1727204078.06210: done getting next task for host managed-node3 12372 1727204078.06213: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 12372 1727204078.06215: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204078.06219: getting variables 12372 1727204078.06221: in VariableManager get_vars() 12372 1727204078.06286: Calling all_inventory to load vars for managed-node3 12372 1727204078.06597: Calling groups_inventory to load vars for managed-node3 12372 1727204078.06603: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204078.06617: Calling all_plugins_play to load vars for managed-node3 12372 1727204078.06621: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204078.06626: Calling groups_plugins_play to load vars for managed-node3 12372 1727204078.07364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204078.07956: done with get_vars() 12372 1727204078.07972: done getting variables 12372 1727204078.08251: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:28 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.156) 0:00:05.068 ***** 12372 1727204078.08284: entering _queue_task() for managed-node3/command 12372 1727204078.08919: worker is 1 (out of 1 available) 12372 1727204078.08931: exiting _queue_task() for managed-node3/command 12372 1727204078.08943: done queuing things up, now waiting for results queue to drain 12372 1727204078.08945: waiting for pending results... 
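The assert task above comes from tasks/assert_device_present.yml:5; its name is templated from 'interface', which the log resolves to 'test2' through the play variable 'dhcp_interface2' (sourced as task vars and play vars respectively). A hedged sketch of that wiring follows; because the task is skipped, the real assertion expression never appears in the log, so the 'that' condition below is only a placeholder.

    # In the test playbook (illustrative wiring):
    - name: Assert device is present
      ansible.builtin.include_tasks: tasks/assert_device_present.yml
      vars:
        interface: "{{ dhcp_interface2 }}"   # resolves to 'test2' in this run

    # In tasks/assert_device_present.yml (illustrative):
    - name: "Assert that the interface is present - '{{ interface }}'"
      ansible.builtin.assert:
        that:
          - interface_stat.stat.exists       # placeholder; the real check is not shown in the log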
12372 1727204078.09136: running TaskExecutor() for managed-node3/TASK: Backup the /etc/resolv.conf for initscript 12372 1727204078.09300: in run() - task 12b410aa-8751-244a-02f9-00000000001d 12372 1727204078.09548: variable 'ansible_search_path' from source: unknown 12372 1727204078.09553: calling self._execute() 12372 1727204078.09641: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204078.09874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204078.09877: variable 'omit' from source: magic vars 12372 1727204078.11497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204078.17956: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204078.18158: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204078.18211: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204078.18496: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204078.18500: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204078.18581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204078.18797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204078.18801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204078.18852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204078.18914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204078.19298: variable 'ansible_distribution' from source: facts 12372 1727204078.19405: variable 'ansible_distribution_major_version' from source: facts 12372 1727204078.19426: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204078.19434: when evaluation is False, skipping this task 12372 1727204078.19442: _execute() done 12372 1727204078.19449: dumping result to json 12372 1727204078.19457: done dumping result, returning 12372 1727204078.19471: done running TaskExecutor() for managed-node3/TASK: Backup the /etc/resolv.conf for initscript [12b410aa-8751-244a-02f9-00000000001d] 12372 1727204078.19798: sending task result for task 12b410aa-8751-244a-02f9-00000000001d 12372 1727204078.19876: done sending task result for task 12b410aa-8751-244a-02f9-00000000001d 12372 1727204078.19880: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204078.19959: no more pending results, returning what we have 12372 1727204078.19963: results queue empty 12372 1727204078.19964: checking for any_errors_fatal 12372 1727204078.19970: done checking for any_errors_fatal 12372 1727204078.19971: checking for max_fail_percentage 12372 1727204078.19972: done checking for max_fail_percentage 12372 1727204078.19973: checking to see if all hosts have failed and the running result is not ok 12372 1727204078.19975: done checking to see if all hosts have failed 12372 1727204078.19976: getting the remaining hosts for this loop 12372 1727204078.19977: done getting the remaining hosts for this loop 12372 1727204078.19982: getting the next task for host managed-node3 12372 1727204078.19992: done getting next task for host managed-node3 12372 1727204078.19995: ^ task is: TASK: TEST Add Bond with 2 ports 12372 1727204078.19997: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204078.20000: getting variables 12372 1727204078.20003: in VariableManager get_vars() 12372 1727204078.20061: Calling all_inventory to load vars for managed-node3 12372 1727204078.20065: Calling groups_inventory to load vars for managed-node3 12372 1727204078.20067: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204078.20077: Calling all_plugins_play to load vars for managed-node3 12372 1727204078.20080: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204078.20084: Calling groups_plugins_play to load vars for managed-node3 12372 1727204078.20712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204078.21683: done with get_vars() 12372 1727204078.21794: done getting variables 12372 1727204078.22031: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST Add Bond with 2 ports] ********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:33 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.137) 0:00:05.206 ***** 12372 1727204078.22070: entering _queue_task() for managed-node3/debug 12372 1727204078.23075: worker is 1 (out of 1 available) 12372 1727204078.23093: exiting _queue_task() for managed-node3/debug 12372 1727204078.23107: done queuing things up, now waiting for results queue to drain 12372 1727204078.23109: waiting for pending results... 
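Two more guarded steps appear above: the resolv.conf backup (tests_bond_removal.yml:28, command action) and the just-queued 'TEST Add Bond with 2 ports' banner (tests_bond_removal.yml:33, debug action). Since both are skipped, neither the command line nor the debug message is recorded; the values below are hypothetical placeholders that only illustrate the shape of such tasks.

    - name: Backup the /etc/resolv.conf for initscript
      ansible.builtin.command: cp -v /etc/resolv.conf /tmp/resolv.conf.bak  # hypothetical command
    - name: TEST Add Bond with 2 ports
      ansible.builtin.debug:
        msg: TEST Add Bond with 2 ports  # placeholder message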
12372 1727204078.23887: running TaskExecutor() for managed-node3/TASK: TEST Add Bond with 2 ports 12372 1727204078.24152: in run() - task 12b410aa-8751-244a-02f9-00000000001e 12372 1727204078.24175: variable 'ansible_search_path' from source: unknown 12372 1727204078.24296: calling self._execute() 12372 1727204078.24653: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204078.24760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204078.24764: variable 'omit' from source: magic vars 12372 1727204078.25682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204078.29321: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204078.29535: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204078.29586: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204078.29641: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204078.29735: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204078.29923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204078.30037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204078.30131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204078.30376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204078.30380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204078.30699: variable 'ansible_distribution' from source: facts 12372 1727204078.30702: variable 'ansible_distribution_major_version' from source: facts 12372 1727204078.30705: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204078.30707: when evaluation is False, skipping this task 12372 1727204078.30709: _execute() done 12372 1727204078.30711: dumping result to json 12372 1727204078.30713: done dumping result, returning 12372 1727204078.30715: done running TaskExecutor() for managed-node3/TASK: TEST Add Bond with 2 ports [12b410aa-8751-244a-02f9-00000000001e] 12372 1727204078.30717: sending task result for task 12b410aa-8751-244a-02f9-00000000001e skipping: [managed-node3] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12372 1727204078.30975: no more pending results, returning what we have 12372 1727204078.30980: results queue empty 12372 1727204078.30981: checking for any_errors_fatal 
12372 1727204078.30994: done checking for any_errors_fatal 12372 1727204078.30996: checking for max_fail_percentage 12372 1727204078.30997: done checking for max_fail_percentage 12372 1727204078.30998: checking to see if all hosts have failed and the running result is not ok 12372 1727204078.31000: done checking to see if all hosts have failed 12372 1727204078.31000: getting the remaining hosts for this loop 12372 1727204078.31002: done getting the remaining hosts for this loop 12372 1727204078.31007: getting the next task for host managed-node3 12372 1727204078.31016: done getting next task for host managed-node3 12372 1727204078.31032: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12372 1727204078.31036: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204078.31203: getting variables 12372 1727204078.31205: in VariableManager get_vars() 12372 1727204078.31263: Calling all_inventory to load vars for managed-node3 12372 1727204078.31267: Calling groups_inventory to load vars for managed-node3 12372 1727204078.31271: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204078.31281: Calling all_plugins_play to load vars for managed-node3 12372 1727204078.31284: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204078.31288: Calling groups_plugins_play to load vars for managed-node3 12372 1727204078.31309: done sending task result for task 12b410aa-8751-244a-02f9-00000000001e 12372 1727204078.31313: WORKER PROCESS EXITING 12372 1727204078.31664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204078.31981: done with get_vars() 12372 1727204078.31997: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.100) 0:00:05.306 ***** 12372 1727204078.32120: entering _queue_task() for managed-node3/include_tasks 12372 1727204078.32472: worker is 1 (out of 1 available) 12372 1727204078.32485: exiting _queue_task() for managed-node3/include_tasks 12372 1727204078.32609: done queuing things up, now waiting for results queue to drain 12372 1727204078.32612: waiting for pending results... 
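Execution now enters the fedora.linux_system_roles.network role: its first task (roles/network/tasks/main.yml:4) is an include_tasks that would normally pull in the role's fact-gathering prerequisites, but it is skipped by the same guard. A hedged sketch follows; the included file name is an assumption, not read from the role source.

    # roles/network/tasks/main.yml, first task (illustrative)
    - name: Ensure ansible_facts used by role
      ansible.builtin.include_tasks: set_facts.yml  # file name is an assumption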
12372 1727204078.32810: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12372 1727204078.32985: in run() - task 12b410aa-8751-244a-02f9-000000000026 12372 1727204078.33012: variable 'ansible_search_path' from source: unknown 12372 1727204078.33021: variable 'ansible_search_path' from source: unknown 12372 1727204078.33079: calling self._execute() 12372 1727204078.33186: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204078.33203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204078.33221: variable 'omit' from source: magic vars 12372 1727204078.33799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204078.39604: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204078.39788: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204078.39896: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204078.39943: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204078.40031: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204078.40333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204078.40355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204078.40437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204078.40625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204078.40629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204078.41096: variable 'ansible_distribution' from source: facts 12372 1727204078.41100: variable 'ansible_distribution_major_version' from source: facts 12372 1727204078.41103: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204078.41106: when evaluation is False, skipping this task 12372 1727204078.41118: _execute() done 12372 1727204078.41127: dumping result to json 12372 1727204078.41184: done dumping result, returning 12372 1727204078.41279: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-244a-02f9-000000000026] 12372 1727204078.41283: sending task result for task 12b410aa-8751-244a-02f9-000000000026 12372 1727204078.41628: done sending task result for task 12b410aa-8751-244a-02f9-000000000026 12372 1727204078.41631: WORKER PROCESS EXITING skipping: 
[managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204078.41687: no more pending results, returning what we have 12372 1727204078.41694: results queue empty 12372 1727204078.41695: checking for any_errors_fatal 12372 1727204078.41717: done checking for any_errors_fatal 12372 1727204078.41718: checking for max_fail_percentage 12372 1727204078.41720: done checking for max_fail_percentage 12372 1727204078.41721: checking to see if all hosts have failed and the running result is not ok 12372 1727204078.41722: done checking to see if all hosts have failed 12372 1727204078.41723: getting the remaining hosts for this loop 12372 1727204078.41725: done getting the remaining hosts for this loop 12372 1727204078.41731: getting the next task for host managed-node3 12372 1727204078.41738: done getting next task for host managed-node3 12372 1727204078.41744: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12372 1727204078.41747: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204078.41764: getting variables 12372 1727204078.41767: in VariableManager get_vars() 12372 1727204078.42057: Calling all_inventory to load vars for managed-node3 12372 1727204078.42061: Calling groups_inventory to load vars for managed-node3 12372 1727204078.42065: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204078.42076: Calling all_plugins_play to load vars for managed-node3 12372 1727204078.42079: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204078.42084: Calling groups_plugins_play to load vars for managed-node3 12372 1727204078.42772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204078.43524: done with get_vars() 12372 1727204078.43541: done getting variables 12372 1727204078.43726: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.116) 0:00:05.423 ***** 12372 1727204078.43763: entering _queue_task() for managed-node3/debug 12372 1727204078.44502: worker is 1 (out of 1 available) 12372 1727204078.44517: exiting _queue_task() for managed-node3/debug 12372 1727204078.44531: done queuing things up, now waiting for results queue to drain 12372 1727204078.44533: waiting for pending results... 
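main.yml:7 is a debug task that would report which backend the role selected. A hedged sketch follows; the exact message wording is an assumption, though network_provider is the variable the role documents for choosing between the nm and initscripts providers.

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"  # wording is an assumption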
12372 1727204078.45511: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 12372 1727204078.46095: in run() - task 12b410aa-8751-244a-02f9-000000000027 12372 1727204078.46100: variable 'ansible_search_path' from source: unknown 12372 1727204078.46102: variable 'ansible_search_path' from source: unknown 12372 1727204078.46106: calling self._execute() 12372 1727204078.46110: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204078.46113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204078.46115: variable 'omit' from source: magic vars 12372 1727204078.47840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204078.53342: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204078.53786: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204078.53844: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204078.53895: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204078.53938: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204078.54045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204078.54093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204078.54138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204078.54203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204078.54233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204078.54394: variable 'ansible_distribution' from source: facts 12372 1727204078.54407: variable 'ansible_distribution_major_version' from source: facts 12372 1727204078.54431: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204078.54495: when evaluation is False, skipping this task 12372 1727204078.54498: _execute() done 12372 1727204078.54500: dumping result to json 12372 1727204078.54502: done dumping result, returning 12372 1727204078.54504: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-244a-02f9-000000000027] 12372 1727204078.54506: sending task result for task 12b410aa-8751-244a-02f9-000000000027 12372 1727204078.54762: done sending task result for task 12b410aa-8751-244a-02f9-000000000027 12372 1727204078.54765: WORKER PROCESS EXITING skipping: [managed-node3] => { 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12372 1727204078.54835: no more pending results, returning what we have 12372 1727204078.54839: results queue empty 12372 1727204078.54840: checking for any_errors_fatal 12372 1727204078.54846: done checking for any_errors_fatal 12372 1727204078.54847: checking for max_fail_percentage 12372 1727204078.54851: done checking for max_fail_percentage 12372 1727204078.54853: checking to see if all hosts have failed and the running result is not ok 12372 1727204078.54854: done checking to see if all hosts have failed 12372 1727204078.54855: getting the remaining hosts for this loop 12372 1727204078.54859: done getting the remaining hosts for this loop 12372 1727204078.54864: getting the next task for host managed-node3 12372 1727204078.54872: done getting next task for host managed-node3 12372 1727204078.54876: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12372 1727204078.54879: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204078.54897: getting variables 12372 1727204078.54899: in VariableManager get_vars() 12372 1727204078.54962: Calling all_inventory to load vars for managed-node3 12372 1727204078.54966: Calling groups_inventory to load vars for managed-node3 12372 1727204078.54969: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204078.54982: Calling all_plugins_play to load vars for managed-node3 12372 1727204078.54986: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204078.55208: Calling groups_plugins_play to load vars for managed-node3 12372 1727204078.56064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204078.56554: done with get_vars() 12372 1727204078.56568: done getting variables 12372 1727204078.56777: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.131) 0:00:05.554 ***** 12372 1727204078.56924: entering _queue_task() for managed-node3/fail 12372 1727204078.56927: Creating lock for fail 12372 1727204078.57596: worker is 1 (out of 1 available) 12372 1727204078.57608: exiting _queue_task() for managed-node3/fail 12372 1727204078.57621: done queuing things up, now waiting for results queue to drain 12372 1727204078.57623: waiting for 
pending results... 12372 1727204078.57725: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12372 1727204078.57928: in run() - task 12b410aa-8751-244a-02f9-000000000028 12372 1727204078.57973: variable 'ansible_search_path' from source: unknown 12372 1727204078.57977: variable 'ansible_search_path' from source: unknown 12372 1727204078.58082: calling self._execute() 12372 1727204078.58119: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204078.58149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204078.58166: variable 'omit' from source: magic vars 12372 1727204078.58784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204078.67572: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204078.67820: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204078.67851: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204078.68143: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204078.68146: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204078.68298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204078.68343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204078.68386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204078.68545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204078.68692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204078.68978: variable 'ansible_distribution' from source: facts 12372 1727204078.69027: variable 'ansible_distribution_major_version' from source: facts 12372 1727204078.69044: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204078.69124: when evaluation is False, skipping this task 12372 1727204078.69127: _execute() done 12372 1727204078.69130: dumping result to json 12372 1727204078.69132: done dumping result, returning 12372 1727204078.69138: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-244a-02f9-000000000028] 12372 1727204078.69149: sending task result for task 
12b410aa-8751-244a-02f9-000000000028 12372 1727204078.69600: done sending task result for task 12b410aa-8751-244a-02f9-000000000028 12372 1727204078.69603: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204078.69657: no more pending results, returning what we have 12372 1727204078.69661: results queue empty 12372 1727204078.69662: checking for any_errors_fatal 12372 1727204078.69670: done checking for any_errors_fatal 12372 1727204078.69671: checking for max_fail_percentage 12372 1727204078.69673: done checking for max_fail_percentage 12372 1727204078.69674: checking to see if all hosts have failed and the running result is not ok 12372 1727204078.69675: done checking to see if all hosts have failed 12372 1727204078.69676: getting the remaining hosts for this loop 12372 1727204078.69678: done getting the remaining hosts for this loop 12372 1727204078.69683: getting the next task for host managed-node3 12372 1727204078.69692: done getting next task for host managed-node3 12372 1727204078.69696: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12372 1727204078.69700: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204078.69718: getting variables 12372 1727204078.69720: in VariableManager get_vars() 12372 1727204078.69783: Calling all_inventory to load vars for managed-node3 12372 1727204078.69788: Calling groups_inventory to load vars for managed-node3 12372 1727204078.70195: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204078.70206: Calling all_plugins_play to load vars for managed-node3 12372 1727204078.70210: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204078.70214: Calling groups_plugins_play to load vars for managed-node3 12372 1727204078.70704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204078.71400: done with get_vars() 12372 1727204078.71414: done getting variables 12372 1727204078.71482: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.145) 0:00:05.700 ***** 12372 1727204078.71522: entering _queue_task() for managed-node3/fail 12372 1727204078.72429: worker is 1 (out of 1 available) 12372 1727204078.72441: exiting _queue_task() for managed-node3/fail 12372 1727204078.72454: done queuing things up, now waiting for results queue to drain 12372 1727204078.72456: waiting for pending results... 
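main.yml:11 and main.yml:18 (the task just queued above) are fail-style guards that would abort the role run for unsupported combinations: applying network_state with the initscripts provider, and managed hosts older than EL8. Both are skipped here, so their messages and conditions never render; the sketch below only illustrates the pattern, and the msg text and when expressions are assumptions.

    - name: >-
        Abort applying the network state configuration if using the
        `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: The network_state variable is not supported with the initscripts provider  # wording is an assumption
      when:
        - network_state is defined            # assumed guard
        - network_provider == "initscripts"   # assumed guard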
12372 1727204078.72917: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12372 1727204078.73083: in run() - task 12b410aa-8751-244a-02f9-000000000029 12372 1727204078.73140: variable 'ansible_search_path' from source: unknown 12372 1727204078.73151: variable 'ansible_search_path' from source: unknown 12372 1727204078.73202: calling self._execute() 12372 1727204078.73504: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204078.73519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204078.73537: variable 'omit' from source: magic vars 12372 1727204078.74621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204078.80876: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204078.81180: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204078.81184: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204078.81186: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204078.81318: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204078.81525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204078.81570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204078.81608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204078.81695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204078.81770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204078.82139: variable 'ansible_distribution' from source: facts 12372 1727204078.82152: variable 'ansible_distribution_major_version' from source: facts 12372 1727204078.82191: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204078.82200: when evaluation is False, skipping this task 12372 1727204078.82208: _execute() done 12372 1727204078.82214: dumping result to json 12372 1727204078.82227: done dumping result, returning 12372 1727204078.82241: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-244a-02f9-000000000029] 12372 1727204078.82252: sending task result for task 12b410aa-8751-244a-02f9-000000000029 skipping: [managed-node3] 
=> { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204078.82427: no more pending results, returning what we have 12372 1727204078.82431: results queue empty 12372 1727204078.82432: checking for any_errors_fatal 12372 1727204078.82439: done checking for any_errors_fatal 12372 1727204078.82440: checking for max_fail_percentage 12372 1727204078.82442: done checking for max_fail_percentage 12372 1727204078.82443: checking to see if all hosts have failed and the running result is not ok 12372 1727204078.82444: done checking to see if all hosts have failed 12372 1727204078.82445: getting the remaining hosts for this loop 12372 1727204078.82447: done getting the remaining hosts for this loop 12372 1727204078.82451: getting the next task for host managed-node3 12372 1727204078.82459: done getting next task for host managed-node3 12372 1727204078.82464: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12372 1727204078.82467: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204078.82481: getting variables 12372 1727204078.82484: in VariableManager get_vars() 12372 1727204078.82545: Calling all_inventory to load vars for managed-node3 12372 1727204078.82549: Calling groups_inventory to load vars for managed-node3 12372 1727204078.82551: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204078.82777: Calling all_plugins_play to load vars for managed-node3 12372 1727204078.82782: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204078.82787: Calling groups_plugins_play to load vars for managed-node3 12372 1727204078.83201: done sending task result for task 12b410aa-8751-244a-02f9-000000000029 12372 1727204078.83204: WORKER PROCESS EXITING 12372 1727204078.83225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204078.83693: done with get_vars() 12372 1727204078.83707: done getting variables 12372 1727204078.83770: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.124) 0:00:05.825 ***** 12372 1727204078.84015: entering _queue_task() for managed-node3/fail 12372 1727204078.84517: worker is 1 (out of 1 available) 12372 1727204078.84534: exiting _queue_task() for managed-node3/fail 
12372 1727204078.84549: done queuing things up, now waiting for results queue to drain 12372 1727204078.84552: waiting for pending results... 12372 1727204078.85122: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12372 1727204078.85345: in run() - task 12b410aa-8751-244a-02f9-00000000002a 12372 1727204078.85406: variable 'ansible_search_path' from source: unknown 12372 1727204078.85419: variable 'ansible_search_path' from source: unknown 12372 1727204078.85475: calling self._execute() 12372 1727204078.85575: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204078.85599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204078.85620: variable 'omit' from source: magic vars 12372 1727204078.86866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204078.90873: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204078.90997: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204078.91041: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204078.91107: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204078.91196: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204078.91272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204078.91321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204078.91454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204078.91510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204078.91538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204078.91730: variable 'ansible_distribution' from source: facts 12372 1727204078.91745: variable 'ansible_distribution_major_version' from source: facts 12372 1727204078.91781: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204078.91785: when evaluation is False, skipping this task 12372 1727204078.91787: _execute() done 12372 1727204078.91887: dumping result to json 12372 1727204078.91895: done dumping result, returning 12372 1727204078.91898: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 
[12b410aa-8751-244a-02f9-00000000002a] 12372 1727204078.91901: sending task result for task 12b410aa-8751-244a-02f9-00000000002a 12372 1727204078.91984: done sending task result for task 12b410aa-8751-244a-02f9-00000000002a 12372 1727204078.91988: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204078.92050: no more pending results, returning what we have 12372 1727204078.92054: results queue empty 12372 1727204078.92055: checking for any_errors_fatal 12372 1727204078.92060: done checking for any_errors_fatal 12372 1727204078.92062: checking for max_fail_percentage 12372 1727204078.92063: done checking for max_fail_percentage 12372 1727204078.92065: checking to see if all hosts have failed and the running result is not ok 12372 1727204078.92066: done checking to see if all hosts have failed 12372 1727204078.92067: getting the remaining hosts for this loop 12372 1727204078.92068: done getting the remaining hosts for this loop 12372 1727204078.92074: getting the next task for host managed-node3 12372 1727204078.92081: done getting next task for host managed-node3 12372 1727204078.92086: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12372 1727204078.92091: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204078.92108: getting variables 12372 1727204078.92110: in VariableManager get_vars() 12372 1727204078.92248: Calling all_inventory to load vars for managed-node3 12372 1727204078.92252: Calling groups_inventory to load vars for managed-node3 12372 1727204078.92256: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204078.92267: Calling all_plugins_play to load vars for managed-node3 12372 1727204078.92270: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204078.92274: Calling groups_plugins_play to load vars for managed-node3 12372 1727204078.92583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204078.93301: done with get_vars() 12372 1727204078.93314: done getting variables 12372 1727204078.93525: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.095) 0:00:05.921 ***** 12372 1727204078.93563: entering _queue_task() for managed-node3/dnf 12372 1727204078.93995: worker is 1 (out of 1 available) 12372 1727204078.94014: exiting _queue_task() for managed-node3/dnf 12372 1727204078.94066: done queuing things up, now waiting for results queue to drain 12372 1727204078.94068: waiting for pending results... 
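main.yml:36 loads the dnf action, presumably to find out whether newer network packages are available when wireless or team interfaces are requested, without installing anything. The sketch below reflects that reading; the package list variable and the use of check_mode are assumptions.

    - name: >-
        Check if updates for network packages are available through the DNF
        package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"  # package list variable is an assumption
        state: latest
      check_mode: true                  # assumed detect-only behaviour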
12372 1727204078.94569: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12372 1727204078.94576: in run() - task 12b410aa-8751-244a-02f9-00000000002b 12372 1727204078.94580: variable 'ansible_search_path' from source: unknown 12372 1727204078.94583: variable 'ansible_search_path' from source: unknown 12372 1727204078.94587: calling self._execute() 12372 1727204078.94639: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204078.94652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204078.94667: variable 'omit' from source: magic vars 12372 1727204078.95140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204078.97797: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204078.97880: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204078.97944: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204078.97966: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204078.98011: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204078.98120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204078.98144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204078.98166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204078.98253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204078.98261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204078.98496: variable 'ansible_distribution' from source: facts 12372 1727204078.98499: variable 'ansible_distribution_major_version' from source: facts 12372 1727204078.98501: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204078.98503: when evaluation is False, skipping this task 12372 1727204078.98505: _execute() done 12372 1727204078.98507: dumping result to json 12372 1727204078.98508: done dumping result, returning 12372 1727204078.98511: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-244a-02f9-00000000002b] 12372 1727204078.98513: sending task result for task 
12b410aa-8751-244a-02f9-00000000002b 12372 1727204078.98586: done sending task result for task 12b410aa-8751-244a-02f9-00000000002b 12372 1727204078.98590: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204078.98644: no more pending results, returning what we have 12372 1727204078.98647: results queue empty 12372 1727204078.98648: checking for any_errors_fatal 12372 1727204078.98654: done checking for any_errors_fatal 12372 1727204078.98655: checking for max_fail_percentage 12372 1727204078.98656: done checking for max_fail_percentage 12372 1727204078.98657: checking to see if all hosts have failed and the running result is not ok 12372 1727204078.98658: done checking to see if all hosts have failed 12372 1727204078.98659: getting the remaining hosts for this loop 12372 1727204078.98661: done getting the remaining hosts for this loop 12372 1727204078.98665: getting the next task for host managed-node3 12372 1727204078.98671: done getting next task for host managed-node3 12372 1727204078.98675: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12372 1727204078.98678: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204078.98694: getting variables 12372 1727204078.98696: in VariableManager get_vars() 12372 1727204078.98745: Calling all_inventory to load vars for managed-node3 12372 1727204078.98749: Calling groups_inventory to load vars for managed-node3 12372 1727204078.98752: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204078.98762: Calling all_plugins_play to load vars for managed-node3 12372 1727204078.98765: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204078.98768: Calling groups_plugins_play to load vars for managed-node3 12372 1727204078.99023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204078.99593: done with get_vars() 12372 1727204078.99602: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12372 1727204078.99661: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.061) 0:00:05.982 ***** 12372 1727204078.99687: entering _queue_task() for managed-node3/yum 12372 1727204078.99688: Creating lock for yum 12372 1727204078.99910: worker is 1 (out of 1 available) 12372 1727204078.99923: exiting _queue_task() for managed-node3/yum 12372 1727204078.99936: done queuing things up, now waiting for results queue to drain 12372 1727204078.99938: waiting for pending results... 
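As the redirect notice above shows, `ansible.builtin.yum` resolves to the `dnf` action plugin on this target. A hedged sketch of the YUM-flavoured check, again treating `network_packages` as a hypothetical variable:

    - name: >-
        Check if updates for network packages are available through the YUM
        package manager due to wireless or team interfaces
      ansible.builtin.yum:              # redirected to ansible.builtin.dnf at runtime (see log)
        name: "{{ network_packages }}"  # hypothetical variable name
        state: latest
      check_mode: true
      when: >-
        ansible_distribution in ['CentOS','RedHat']
        and ansible_distribution_major_version | int < 9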
12372 1727204079.00117: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12372 1727204079.00219: in run() - task 12b410aa-8751-244a-02f9-00000000002c 12372 1727204079.00234: variable 'ansible_search_path' from source: unknown 12372 1727204079.00238: variable 'ansible_search_path' from source: unknown 12372 1727204079.00277: calling self._execute() 12372 1727204079.00348: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204079.00357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204079.00366: variable 'omit' from source: magic vars 12372 1727204079.00968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204079.05100: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204079.05360: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204079.05384: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204079.05486: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204079.05596: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204079.05799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204079.06008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204079.06031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204079.06092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204079.06117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204079.06286: variable 'ansible_distribution' from source: facts 12372 1727204079.06303: variable 'ansible_distribution_major_version' from source: facts 12372 1727204079.06318: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204079.06330: when evaluation is False, skipping this task 12372 1727204079.06341: _execute() done 12372 1727204079.06348: dumping result to json 12372 1727204079.06356: done dumping result, returning 12372 1727204079.06368: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-244a-02f9-00000000002c] 12372 1727204079.06377: sending task result for task 
12b410aa-8751-244a-02f9-00000000002c 12372 1727204079.06655: done sending task result for task 12b410aa-8751-244a-02f9-00000000002c 12372 1727204079.06658: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204079.06721: no more pending results, returning what we have 12372 1727204079.06724: results queue empty 12372 1727204079.06725: checking for any_errors_fatal 12372 1727204079.06735: done checking for any_errors_fatal 12372 1727204079.06736: checking for max_fail_percentage 12372 1727204079.06737: done checking for max_fail_percentage 12372 1727204079.06738: checking to see if all hosts have failed and the running result is not ok 12372 1727204079.06739: done checking to see if all hosts have failed 12372 1727204079.06740: getting the remaining hosts for this loop 12372 1727204079.06742: done getting the remaining hosts for this loop 12372 1727204079.06746: getting the next task for host managed-node3 12372 1727204079.06754: done getting next task for host managed-node3 12372 1727204079.06760: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12372 1727204079.06764: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204079.06780: getting variables 12372 1727204079.06782: in VariableManager get_vars() 12372 1727204079.06846: Calling all_inventory to load vars for managed-node3 12372 1727204079.06850: Calling groups_inventory to load vars for managed-node3 12372 1727204079.06852: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204079.06862: Calling all_plugins_play to load vars for managed-node3 12372 1727204079.06865: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204079.06868: Calling groups_plugins_play to load vars for managed-node3 12372 1727204079.07041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204079.07222: done with get_vars() 12372 1727204079.07232: done getting variables 12372 1727204079.07280: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.076) 0:00:06.058 ***** 12372 1727204079.07310: entering _queue_task() for managed-node3/fail 12372 1727204079.07536: worker is 1 (out of 1 available) 12372 1727204079.07552: exiting _queue_task() for managed-node3/fail 12372 1727204079.07564: done queuing things up, now waiting for results queue to drain 12372 1727204079.07566: waiting for pending results... 
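The consent task at main.yml:60 uses the `fail` action module loaded above. The log only reports the first condition that evaluated to False, so any additional consent-related conditions in the real task are not visible here; under that caveat, and with the message wording assumed, a minimal sketch:

    - name: >-
        Ask user's consent to restart NetworkManager due to wireless or team
        interfaces
      ansible.builtin.fail:
        # message wording is assumed, not taken from this log
        msg: >-
          Wireless or team interfaces require NetworkManager to be restarted;
          re-run the role after confirming that a restart is acceptable.
      when: >-
        ansible_distribution in ['CentOS','RedHat']
        and ansible_distribution_major_version | int < 9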
12372 1727204079.07752: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12372 1727204079.07857: in run() - task 12b410aa-8751-244a-02f9-00000000002d 12372 1727204079.07870: variable 'ansible_search_path' from source: unknown 12372 1727204079.07873: variable 'ansible_search_path' from source: unknown 12372 1727204079.07912: calling self._execute() 12372 1727204079.07991: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204079.07996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204079.08006: variable 'omit' from source: magic vars 12372 1727204079.08374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204079.10149: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204079.10207: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204079.10241: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204079.10270: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204079.10293: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204079.10365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204079.10388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204079.10413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204079.10451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204079.10464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204079.10578: variable 'ansible_distribution' from source: facts 12372 1727204079.10582: variable 'ansible_distribution_major_version' from source: facts 12372 1727204079.10595: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204079.10598: when evaluation is False, skipping this task 12372 1727204079.10601: _execute() done 12372 1727204079.10606: dumping result to json 12372 1727204079.10611: done dumping result, returning 12372 1727204079.10625: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-244a-02f9-00000000002d] 12372 1727204079.10628: sending task result for task 12b410aa-8751-244a-02f9-00000000002d 12372 1727204079.10722: done sending task result for task 
12b410aa-8751-244a-02f9-00000000002d 12372 1727204079.10725: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204079.10781: no more pending results, returning what we have 12372 1727204079.10784: results queue empty 12372 1727204079.10785: checking for any_errors_fatal 12372 1727204079.10800: done checking for any_errors_fatal 12372 1727204079.10801: checking for max_fail_percentage 12372 1727204079.10803: done checking for max_fail_percentage 12372 1727204079.10804: checking to see if all hosts have failed and the running result is not ok 12372 1727204079.10805: done checking to see if all hosts have failed 12372 1727204079.10806: getting the remaining hosts for this loop 12372 1727204079.10807: done getting the remaining hosts for this loop 12372 1727204079.10812: getting the next task for host managed-node3 12372 1727204079.10818: done getting next task for host managed-node3 12372 1727204079.10822: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 12372 1727204079.10825: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204079.10840: getting variables 12372 1727204079.10842: in VariableManager get_vars() 12372 1727204079.10894: Calling all_inventory to load vars for managed-node3 12372 1727204079.10898: Calling groups_inventory to load vars for managed-node3 12372 1727204079.10907: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204079.10917: Calling all_plugins_play to load vars for managed-node3 12372 1727204079.10920: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204079.10923: Calling groups_plugins_play to load vars for managed-node3 12372 1727204079.11109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204079.11280: done with get_vars() 12372 1727204079.11291: done getting variables 12372 1727204079.11337: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.040) 0:00:06.099 ***** 12372 1727204079.11365: entering _queue_task() for managed-node3/package 12372 1727204079.11567: worker is 1 (out of 1 available) 12372 1727204079.11581: exiting _queue_task() for managed-node3/package 12372 1727204079.11596: done queuing things up, now waiting for results queue to drain 12372 1727204079.11598: waiting for pending results... 
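The "Install packages" task at main.yml:73 goes through the generic `package` action plugin loaded above. A sketch with the package list left as a hypothetical variable:

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"  # hypothetical variable name
        state: present
      when: >-
        ansible_distribution in ['CentOS','RedHat']
        and ansible_distribution_major_version | int < 9

Using `package` rather than `dnf` or `yum` lets the same task work across package managers; the skip below again comes from the distribution check.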
12372 1727204079.11767: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 12372 1727204079.11878: in run() - task 12b410aa-8751-244a-02f9-00000000002e 12372 1727204079.11893: variable 'ansible_search_path' from source: unknown 12372 1727204079.11899: variable 'ansible_search_path' from source: unknown 12372 1727204079.11933: calling self._execute() 12372 1727204079.12007: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204079.12014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204079.12026: variable 'omit' from source: magic vars 12372 1727204079.12390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204079.14132: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204079.14193: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204079.14224: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204079.14258: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204079.14281: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204079.14353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204079.14379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204079.14402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204079.14436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204079.14454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204079.14563: variable 'ansible_distribution' from source: facts 12372 1727204079.14567: variable 'ansible_distribution_major_version' from source: facts 12372 1727204079.14580: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204079.14584: when evaluation is False, skipping this task 12372 1727204079.14586: _execute() done 12372 1727204079.14592: dumping result to json 12372 1727204079.14598: done dumping result, returning 12372 1727204079.14606: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-244a-02f9-00000000002e] 12372 1727204079.14612: sending task result for task 12b410aa-8751-244a-02f9-00000000002e skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": 
"Conditional result was False" } 12372 1727204079.14769: no more pending results, returning what we have 12372 1727204079.14773: results queue empty 12372 1727204079.14775: checking for any_errors_fatal 12372 1727204079.14780: done checking for any_errors_fatal 12372 1727204079.14781: checking for max_fail_percentage 12372 1727204079.14782: done checking for max_fail_percentage 12372 1727204079.14783: checking to see if all hosts have failed and the running result is not ok 12372 1727204079.14785: done checking to see if all hosts have failed 12372 1727204079.14786: getting the remaining hosts for this loop 12372 1727204079.14787: done getting the remaining hosts for this loop 12372 1727204079.14793: getting the next task for host managed-node3 12372 1727204079.14799: done getting next task for host managed-node3 12372 1727204079.14805: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12372 1727204079.14809: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204079.14825: getting variables 12372 1727204079.14827: in VariableManager get_vars() 12372 1727204079.14876: Calling all_inventory to load vars for managed-node3 12372 1727204079.14880: Calling groups_inventory to load vars for managed-node3 12372 1727204079.14882: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204079.14899: Calling all_plugins_play to load vars for managed-node3 12372 1727204079.14903: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204079.14909: done sending task result for task 12b410aa-8751-244a-02f9-00000000002e 12372 1727204079.14912: WORKER PROCESS EXITING 12372 1727204079.14918: Calling groups_plugins_play to load vars for managed-node3 12372 1727204079.15064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204079.15243: done with get_vars() 12372 1727204079.15253: done getting variables 12372 1727204079.15301: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.039) 0:00:06.138 ***** 12372 1727204079.15328: entering _queue_task() for managed-node3/package 12372 1727204079.15541: worker is 1 (out of 1 available) 12372 1727204079.15557: exiting _queue_task() for managed-node3/package 12372 1727204079.15569: done queuing things up, now waiting for results queue to drain 12372 1727204079.15572: waiting for pending results... 
12372 1727204079.15753: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12372 1727204079.15853: in run() - task 12b410aa-8751-244a-02f9-00000000002f 12372 1727204079.15866: variable 'ansible_search_path' from source: unknown 12372 1727204079.15871: variable 'ansible_search_path' from source: unknown 12372 1727204079.15904: calling self._execute() 12372 1727204079.15972: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204079.15979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204079.15991: variable 'omit' from source: magic vars 12372 1727204079.16346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204079.18092: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204079.18147: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204079.18176: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204079.18214: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204079.18234: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204079.18305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204079.18334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204079.18356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204079.18388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204079.18402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204079.18518: variable 'ansible_distribution' from source: facts 12372 1727204079.18522: variable 'ansible_distribution_major_version' from source: facts 12372 1727204079.18533: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204079.18538: when evaluation is False, skipping this task 12372 1727204079.18540: _execute() done 12372 1727204079.18543: dumping result to json 12372 1727204079.18552: done dumping result, returning 12372 1727204079.18557: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-244a-02f9-00000000002f] 12372 1727204079.18563: sending task result for task 12b410aa-8751-244a-02f9-00000000002f 12372 1727204079.18664: done sending task result for task 
12b410aa-8751-244a-02f9-00000000002f 12372 1727204079.18667: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204079.18717: no more pending results, returning what we have 12372 1727204079.18722: results queue empty 12372 1727204079.18723: checking for any_errors_fatal 12372 1727204079.18730: done checking for any_errors_fatal 12372 1727204079.18731: checking for max_fail_percentage 12372 1727204079.18733: done checking for max_fail_percentage 12372 1727204079.18734: checking to see if all hosts have failed and the running result is not ok 12372 1727204079.18735: done checking to see if all hosts have failed 12372 1727204079.18736: getting the remaining hosts for this loop 12372 1727204079.18738: done getting the remaining hosts for this loop 12372 1727204079.18742: getting the next task for host managed-node3 12372 1727204079.18749: done getting next task for host managed-node3 12372 1727204079.18753: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12372 1727204079.18756: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204079.18771: getting variables 12372 1727204079.18772: in VariableManager get_vars() 12372 1727204079.18835: Calling all_inventory to load vars for managed-node3 12372 1727204079.18838: Calling groups_inventory to load vars for managed-node3 12372 1727204079.18841: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204079.18850: Calling all_plugins_play to load vars for managed-node3 12372 1727204079.18853: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204079.18857: Calling groups_plugins_play to load vars for managed-node3 12372 1727204079.19038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204079.19205: done with get_vars() 12372 1727204079.19214: done getting variables 12372 1727204079.19263: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.039) 0:00:06.178 ***** 12372 1727204079.19288: entering _queue_task() for managed-node3/package 12372 1727204079.19494: worker is 1 (out of 1 available) 12372 1727204079.19508: exiting _queue_task() for managed-node3/package 12372 1727204079.19520: done queuing things up, now waiting for results queue to drain 12372 1727204079.19522: waiting for pending results... 
12372 1727204079.19703: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12372 1727204079.19802: in run() - task 12b410aa-8751-244a-02f9-000000000030 12372 1727204079.19815: variable 'ansible_search_path' from source: unknown 12372 1727204079.19822: variable 'ansible_search_path' from source: unknown 12372 1727204079.19855: calling self._execute() 12372 1727204079.19932: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204079.19938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204079.19948: variable 'omit' from source: magic vars 12372 1727204079.20319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204079.22092: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204079.22151: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204079.22184: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204079.22215: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204079.22241: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204079.22312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204079.22340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204079.22363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204079.22401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204079.22414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204079.22533: variable 'ansible_distribution' from source: facts 12372 1727204079.22539: variable 'ansible_distribution_major_version' from source: facts 12372 1727204079.22549: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204079.22552: when evaluation is False, skipping this task 12372 1727204079.22556: _execute() done 12372 1727204079.22560: dumping result to json 12372 1727204079.22565: done dumping result, returning 12372 1727204079.22574: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-244a-02f9-000000000030] 12372 1727204079.22579: sending task result for task 12b410aa-8751-244a-02f9-000000000030 12372 1727204079.22679: done sending task result for task 12b410aa-8751-244a-02f9-000000000030 12372 
1727204079.22682: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204079.22755: no more pending results, returning what we have 12372 1727204079.22759: results queue empty 12372 1727204079.22760: checking for any_errors_fatal 12372 1727204079.22768: done checking for any_errors_fatal 12372 1727204079.22769: checking for max_fail_percentage 12372 1727204079.22771: done checking for max_fail_percentage 12372 1727204079.22772: checking to see if all hosts have failed and the running result is not ok 12372 1727204079.22773: done checking to see if all hosts have failed 12372 1727204079.22774: getting the remaining hosts for this loop 12372 1727204079.22775: done getting the remaining hosts for this loop 12372 1727204079.22779: getting the next task for host managed-node3 12372 1727204079.22786: done getting next task for host managed-node3 12372 1727204079.22792: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12372 1727204079.22795: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204079.22812: getting variables 12372 1727204079.22814: in VariableManager get_vars() 12372 1727204079.22865: Calling all_inventory to load vars for managed-node3 12372 1727204079.22868: Calling groups_inventory to load vars for managed-node3 12372 1727204079.22871: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204079.22881: Calling all_plugins_play to load vars for managed-node3 12372 1727204079.22884: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204079.22888: Calling groups_plugins_play to load vars for managed-node3 12372 1727204079.23042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204079.23214: done with get_vars() 12372 1727204079.23226: done getting variables 12372 1727204079.23305: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.040) 0:00:06.219 ***** 12372 1727204079.23335: entering _queue_task() for managed-node3/service 12372 1727204079.23337: Creating lock for service 12372 1727204079.23551: worker is 1 (out of 1 available) 12372 1727204079.23567: exiting _queue_task() for managed-node3/service 12372 1727204079.23581: done queuing things up, now waiting for results queue to drain 12372 1727204079.23583: waiting for pending results... 
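main.yml:109 is the first task in this log to use the `service` action plugin (note "Creating lock for service" above). A minimal sketch of a restart task matching the logged name, with arguments assumed:

    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager
        state: restarted
      when: >-
        ansible_distribution in ['CentOS','RedHat']
        and ansible_distribution_major_version | int < 9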
12372 1727204079.23764: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12372 1727204079.23865: in run() - task 12b410aa-8751-244a-02f9-000000000031 12372 1727204079.23877: variable 'ansible_search_path' from source: unknown 12372 1727204079.23881: variable 'ansible_search_path' from source: unknown 12372 1727204079.23918: calling self._execute() 12372 1727204079.23991: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204079.23998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204079.24008: variable 'omit' from source: magic vars 12372 1727204079.24429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204079.26172: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204079.26233: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204079.26273: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204079.26307: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204079.26333: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204079.26401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204079.26432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204079.26454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204079.26486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204079.26500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204079.26614: variable 'ansible_distribution' from source: facts 12372 1727204079.26621: variable 'ansible_distribution_major_version' from source: facts 12372 1727204079.26633: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204079.26636: when evaluation is False, skipping this task 12372 1727204079.26638: _execute() done 12372 1727204079.26643: dumping result to json 12372 1727204079.26650: done dumping result, returning 12372 1727204079.26659: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-244a-02f9-000000000031] 12372 1727204079.26663: sending task result for task 12b410aa-8751-244a-02f9-000000000031 12372 1727204079.26766: done sending task result for task 12b410aa-8751-244a-02f9-000000000031 12372 
1727204079.26769: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204079.26819: no more pending results, returning what we have 12372 1727204079.26823: results queue empty 12372 1727204079.26824: checking for any_errors_fatal 12372 1727204079.26832: done checking for any_errors_fatal 12372 1727204079.26833: checking for max_fail_percentage 12372 1727204079.26835: done checking for max_fail_percentage 12372 1727204079.26836: checking to see if all hosts have failed and the running result is not ok 12372 1727204079.26837: done checking to see if all hosts have failed 12372 1727204079.26838: getting the remaining hosts for this loop 12372 1727204079.26839: done getting the remaining hosts for this loop 12372 1727204079.26843: getting the next task for host managed-node3 12372 1727204079.26850: done getting next task for host managed-node3 12372 1727204079.26854: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12372 1727204079.26857: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204079.26873: getting variables 12372 1727204079.26874: in VariableManager get_vars() 12372 1727204079.26929: Calling all_inventory to load vars for managed-node3 12372 1727204079.26933: Calling groups_inventory to load vars for managed-node3 12372 1727204079.26935: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204079.26945: Calling all_plugins_play to load vars for managed-node3 12372 1727204079.26947: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204079.26950: Calling groups_plugins_play to load vars for managed-node3 12372 1727204079.27145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204079.27314: done with get_vars() 12372 1727204079.27327: done getting variables 12372 1727204079.27373: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.040) 0:00:06.259 ***** 12372 1727204079.27400: entering _queue_task() for managed-node3/service 12372 1727204079.27615: worker is 1 (out of 1 available) 12372 1727204079.27632: exiting _queue_task() for managed-node3/service 12372 1727204079.27645: done queuing things up, now waiting for results queue to drain 12372 1727204079.27647: waiting for pending results... 
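The "Enable and start NetworkManager" task is the one whose result is censored a little further down ("'no_log: true' was specified for this result"), so a sketch of it should carry `no_log`; the module arguments are otherwise assumptions:

    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: NetworkManager
        state: started
        enabled: true
      no_log: true                      # explains the censored result shown below
      when: >-
        ansible_distribution in ['CentOS','RedHat']
        and ansible_distribution_major_version | int < 9

Even a skipped task honours `no_log`, which is why the skip result below is replaced by the censored placeholder instead of the usual false_condition output.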
12372 1727204079.27822: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12372 1727204079.27924: in run() - task 12b410aa-8751-244a-02f9-000000000032 12372 1727204079.27937: variable 'ansible_search_path' from source: unknown 12372 1727204079.27941: variable 'ansible_search_path' from source: unknown 12372 1727204079.27973: calling self._execute() 12372 1727204079.28048: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204079.28054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204079.28064: variable 'omit' from source: magic vars 12372 1727204079.28438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204079.30204: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204079.30260: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204079.30294: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204079.30327: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204079.30351: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204079.30425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204079.30448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204079.30469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204079.30508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204079.30523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204079.30635: variable 'ansible_distribution' from source: facts 12372 1727204079.30642: variable 'ansible_distribution_major_version' from source: facts 12372 1727204079.30652: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204079.30655: when evaluation is False, skipping this task 12372 1727204079.30659: _execute() done 12372 1727204079.30662: dumping result to json 12372 1727204079.30668: done dumping result, returning 12372 1727204079.30676: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-244a-02f9-000000000032] 12372 1727204079.30682: sending task result for task 12b410aa-8751-244a-02f9-000000000032 12372 1727204079.30778: done sending task result for task 12b410aa-8751-244a-02f9-000000000032 12372 1727204079.30781: WORKER PROCESS EXITING skipping: 
[managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12372 1727204079.30852: no more pending results, returning what we have 12372 1727204079.30856: results queue empty 12372 1727204079.30857: checking for any_errors_fatal 12372 1727204079.30863: done checking for any_errors_fatal 12372 1727204079.30864: checking for max_fail_percentage 12372 1727204079.30866: done checking for max_fail_percentage 12372 1727204079.30867: checking to see if all hosts have failed and the running result is not ok 12372 1727204079.30868: done checking to see if all hosts have failed 12372 1727204079.30869: getting the remaining hosts for this loop 12372 1727204079.30870: done getting the remaining hosts for this loop 12372 1727204079.30874: getting the next task for host managed-node3 12372 1727204079.30880: done getting next task for host managed-node3 12372 1727204079.30884: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12372 1727204079.30888: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204079.30905: getting variables 12372 1727204079.30907: in VariableManager get_vars() 12372 1727204079.30955: Calling all_inventory to load vars for managed-node3 12372 1727204079.30959: Calling groups_inventory to load vars for managed-node3 12372 1727204079.30962: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204079.30971: Calling all_plugins_play to load vars for managed-node3 12372 1727204079.30974: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204079.30977: Calling groups_plugins_play to load vars for managed-node3 12372 1727204079.31129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204079.31326: done with get_vars() 12372 1727204079.31335: done getting variables 12372 1727204079.31381: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.040) 0:00:06.299 ***** 12372 1727204079.31408: entering _queue_task() for managed-node3/service 12372 1727204079.31619: worker is 1 (out of 1 available) 12372 1727204079.31634: exiting _queue_task() for managed-node3/service 12372 1727204079.31646: done queuing things up, now waiting for results queue to drain 12372 1727204079.31648: waiting for pending results... 
12372 1727204079.31823: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12372 1727204079.31924: in run() - task 12b410aa-8751-244a-02f9-000000000033 12372 1727204079.31936: variable 'ansible_search_path' from source: unknown 12372 1727204079.31941: variable 'ansible_search_path' from source: unknown 12372 1727204079.31973: calling self._execute() 12372 1727204079.32044: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204079.32051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204079.32061: variable 'omit' from source: magic vars 12372 1727204079.32426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204079.34154: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204079.34222: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204079.34251: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204079.34287: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204079.34309: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204079.34378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204079.34406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204079.34429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204079.34461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204079.34474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204079.34584: variable 'ansible_distribution' from source: facts 12372 1727204079.34591: variable 'ansible_distribution_major_version' from source: facts 12372 1727204079.34603: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204079.34606: when evaluation is False, skipping this task 12372 1727204079.34609: _execute() done 12372 1727204079.34613: dumping result to json 12372 1727204079.34626: done dumping result, returning 12372 1727204079.34629: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-244a-02f9-000000000033] 12372 1727204079.34632: sending task result for task 12b410aa-8751-244a-02f9-000000000033 12372 1727204079.34730: done sending task result for task 12b410aa-8751-244a-02f9-000000000033 skipping: [managed-node3] => { "changed": false, "false_condition": 
"(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204079.34782: no more pending results, returning what we have 12372 1727204079.34786: results queue empty 12372 1727204079.34787: checking for any_errors_fatal 12372 1727204079.34796: done checking for any_errors_fatal 12372 1727204079.34797: checking for max_fail_percentage 12372 1727204079.34799: done checking for max_fail_percentage 12372 1727204079.34800: checking to see if all hosts have failed and the running result is not ok 12372 1727204079.34801: done checking to see if all hosts have failed 12372 1727204079.34802: getting the remaining hosts for this loop 12372 1727204079.34803: done getting the remaining hosts for this loop 12372 1727204079.34807: getting the next task for host managed-node3 12372 1727204079.34813: done getting next task for host managed-node3 12372 1727204079.34820: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12372 1727204079.34823: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204079.34839: getting variables 12372 1727204079.34840: in VariableManager get_vars() 12372 1727204079.34899: Calling all_inventory to load vars for managed-node3 12372 1727204079.34902: Calling groups_inventory to load vars for managed-node3 12372 1727204079.34905: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204079.34912: WORKER PROCESS EXITING 12372 1727204079.34923: Calling all_plugins_play to load vars for managed-node3 12372 1727204079.34926: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204079.34929: Calling groups_plugins_play to load vars for managed-node3 12372 1727204079.35071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204079.35247: done with get_vars() 12372 1727204079.35256: done getting variables 12372 1727204079.35304: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.039) 0:00:06.339 ***** 12372 1727204079.35332: entering _queue_task() for managed-node3/service 12372 1727204079.35534: worker is 1 (out of 1 available) 12372 1727204079.35550: exiting _queue_task() for managed-node3/service 12372 1727204079.35562: done queuing things up, now waiting for results queue to drain 12372 1727204079.35564: waiting for pending results... 
12372 1727204079.35745: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 12372 1727204079.35848: in run() - task 12b410aa-8751-244a-02f9-000000000034 12372 1727204079.35859: variable 'ansible_search_path' from source: unknown 12372 1727204079.35863: variable 'ansible_search_path' from source: unknown 12372 1727204079.35912: calling self._execute() 12372 1727204079.35967: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204079.35973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204079.35984: variable 'omit' from source: magic vars 12372 1727204079.36353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204079.38143: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204079.38201: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204079.38236: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204079.38265: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204079.38288: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204079.38361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204079.38384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204079.38410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204079.38447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204079.38460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204079.38572: variable 'ansible_distribution' from source: facts 12372 1727204079.38577: variable 'ansible_distribution_major_version' from source: facts 12372 1727204079.38588: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204079.38593: when evaluation is False, skipping this task 12372 1727204079.38596: _execute() done 12372 1727204079.38601: dumping result to json 12372 1727204079.38606: done dumping result, returning 12372 1727204079.38615: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-244a-02f9-000000000034] 12372 1727204079.38625: sending task result for task 12b410aa-8751-244a-02f9-000000000034 12372 1727204079.38717: done sending task result for task 12b410aa-8751-244a-02f9-000000000034 12372 1727204079.38720: WORKER PROCESS EXITING skipping: [managed-node3] => { 
"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12372 1727204079.38770: no more pending results, returning what we have 12372 1727204079.38774: results queue empty 12372 1727204079.38775: checking for any_errors_fatal 12372 1727204079.38783: done checking for any_errors_fatal 12372 1727204079.38784: checking for max_fail_percentage 12372 1727204079.38787: done checking for max_fail_percentage 12372 1727204079.38788: checking to see if all hosts have failed and the running result is not ok 12372 1727204079.38796: done checking to see if all hosts have failed 12372 1727204079.38797: getting the remaining hosts for this loop 12372 1727204079.38799: done getting the remaining hosts for this loop 12372 1727204079.38804: getting the next task for host managed-node3 12372 1727204079.38810: done getting next task for host managed-node3 12372 1727204079.38815: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12372 1727204079.38818: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204079.38833: getting variables 12372 1727204079.38835: in VariableManager get_vars() 12372 1727204079.38885: Calling all_inventory to load vars for managed-node3 12372 1727204079.38888: Calling groups_inventory to load vars for managed-node3 12372 1727204079.38893: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204079.38908: Calling all_plugins_play to load vars for managed-node3 12372 1727204079.38912: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204079.38916: Calling groups_plugins_play to load vars for managed-node3 12372 1727204079.39101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204079.39270: done with get_vars() 12372 1727204079.39279: done getting variables 12372 1727204079.39327: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.040) 0:00:06.379 ***** 12372 1727204079.39355: entering _queue_task() for managed-node3/copy 12372 1727204079.39562: worker is 1 (out of 1 available) 12372 1727204079.39578: exiting _queue_task() for managed-node3/copy 12372 1727204079.39593: done queuing things up, now waiting for results queue to drain 12372 1727204079.39595: waiting for pending results... 
12372 1727204079.39768: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12372 1727204079.39869: in run() - task 12b410aa-8751-244a-02f9-000000000035 12372 1727204079.39880: variable 'ansible_search_path' from source: unknown 12372 1727204079.39883: variable 'ansible_search_path' from source: unknown 12372 1727204079.39924: calling self._execute() 12372 1727204079.40000: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204079.40007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204079.40020: variable 'omit' from source: magic vars 12372 1727204079.40434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204079.42146: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204079.42213: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204079.42245: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204079.42281: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204079.42306: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204079.42377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204079.42404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204079.42428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204079.42459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204079.42474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204079.42794: variable 'ansible_distribution' from source: facts 12372 1727204079.42798: variable 'ansible_distribution_major_version' from source: facts 12372 1727204079.42800: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204079.42802: when evaluation is False, skipping this task 12372 1727204079.42805: _execute() done 12372 1727204079.42807: dumping result to json 12372 1727204079.42809: done dumping result, returning 12372 1727204079.42811: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-244a-02f9-000000000035] 12372 1727204079.42813: sending task result for task 12b410aa-8751-244a-02f9-000000000035 12372 1727204079.42905: done sending task result for task 12b410aa-8751-244a-02f9-000000000035 12372 1727204079.42909: 
WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204079.42957: no more pending results, returning what we have 12372 1727204079.42960: results queue empty 12372 1727204079.42961: checking for any_errors_fatal 12372 1727204079.42969: done checking for any_errors_fatal 12372 1727204079.42969: checking for max_fail_percentage 12372 1727204079.42972: done checking for max_fail_percentage 12372 1727204079.42973: checking to see if all hosts have failed and the running result is not ok 12372 1727204079.42974: done checking to see if all hosts have failed 12372 1727204079.42975: getting the remaining hosts for this loop 12372 1727204079.42976: done getting the remaining hosts for this loop 12372 1727204079.42982: getting the next task for host managed-node3 12372 1727204079.42989: done getting next task for host managed-node3 12372 1727204079.42995: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12372 1727204079.42998: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204079.43012: getting variables 12372 1727204079.43014: in VariableManager get_vars() 12372 1727204079.43062: Calling all_inventory to load vars for managed-node3 12372 1727204079.43066: Calling groups_inventory to load vars for managed-node3 12372 1727204079.43069: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204079.43078: Calling all_plugins_play to load vars for managed-node3 12372 1727204079.43081: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204079.43085: Calling groups_plugins_play to load vars for managed-node3 12372 1727204079.43330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204079.43629: done with get_vars() 12372 1727204079.43642: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.043) 0:00:06.423 ***** 12372 1727204079.43747: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 12372 1727204079.43749: Creating lock for fedora.linux_system_roles.network_connections 12372 1727204079.44019: worker is 1 (out of 1 available) 12372 1727204079.44034: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 12372 1727204079.44047: done queuing things up, now waiting for results queue to drain 12372 1727204079.44049: waiting for pending results... 
12372 1727204079.44338: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12372 1727204079.44480: in run() - task 12b410aa-8751-244a-02f9-000000000036 12372 1727204079.44494: variable 'ansible_search_path' from source: unknown 12372 1727204079.44497: variable 'ansible_search_path' from source: unknown 12372 1727204079.44536: calling self._execute() 12372 1727204079.44611: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204079.44626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204079.44633: variable 'omit' from source: magic vars 12372 1727204079.44993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204079.46880: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204079.47104: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204079.47108: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204079.47111: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204079.47113: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204079.47195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204079.47237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204079.47275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204079.47333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204079.47356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204079.47517: variable 'ansible_distribution' from source: facts 12372 1727204079.47532: variable 'ansible_distribution_major_version' from source: facts 12372 1727204079.47550: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204079.47561: when evaluation is False, skipping this task 12372 1727204079.47564: _execute() done 12372 1727204079.47568: dumping result to json 12372 1727204079.47577: done dumping result, returning 12372 1727204079.47593: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-244a-02f9-000000000036] 12372 1727204079.47603: sending task result for task 12b410aa-8751-244a-02f9-000000000036 skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204079.47774: no more pending results, returning what we have 12372 1727204079.47778: results queue empty 12372 1727204079.47779: checking for any_errors_fatal 12372 1727204079.47835: done checking for any_errors_fatal 12372 1727204079.47836: checking for max_fail_percentage 12372 1727204079.47838: done checking for max_fail_percentage 12372 1727204079.47839: checking to see if all hosts have failed and the running result is not ok 12372 1727204079.47840: done checking to see if all hosts have failed 12372 1727204079.47841: getting the remaining hosts for this loop 12372 1727204079.47843: done getting the remaining hosts for this loop 12372 1727204079.47848: getting the next task for host managed-node3 12372 1727204079.47855: done getting next task for host managed-node3 12372 1727204079.47859: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12372 1727204079.47862: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204079.47892: done sending task result for task 12b410aa-8751-244a-02f9-000000000036 12372 1727204079.47895: WORKER PROCESS EXITING 12372 1727204079.47907: getting variables 12372 1727204079.47909: in VariableManager get_vars() 12372 1727204079.47965: Calling all_inventory to load vars for managed-node3 12372 1727204079.47968: Calling groups_inventory to load vars for managed-node3 12372 1727204079.47971: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204079.48005: Calling all_plugins_play to load vars for managed-node3 12372 1727204079.48010: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204079.48015: Calling groups_plugins_play to load vars for managed-node3 12372 1727204079.48288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204079.48563: done with get_vars() 12372 1727204079.48575: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.049) 0:00:06.472 ***** 12372 1727204079.48672: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 12372 1727204079.48674: Creating lock for fedora.linux_system_roles.network_state 12372 1727204079.48939: worker is 1 (out of 1 available) 12372 1727204079.48952: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 12372 1727204079.48964: done queuing things up, now waiting for results queue to drain 12372 1727204079.48965: waiting for pending results... 
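'Configure networking connection profiles' and 'Configure networking state' are dispatched through the collection's own action plugins, which is why the worker creates locks for fedora.linux_system_roles.network_connections and fedora.linux_system_roles.network_state before queuing them. A sketch of how such tasks could look; the parameter names below are assumptions for illustration, only the task names and action names appear in the log:

- name: Configure networking connection profiles
  fedora.linux_system_roles.network_connections:
    connections: "{{ network_connections | default([]) }}"   # assumed parameter
  when: >-
    (ansible_distribution in ['CentOS','RedHat']
     and ansible_distribution_major_version | int < 9)

- name: Configure networking state
  fedora.linux_system_roles.network_state:
    state: "{{ network_state | default({}) }}"                # assumed parameter
  # skipped by the same conditional in the run above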
12372 1727204079.49409: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 12372 1727204079.49414: in run() - task 12b410aa-8751-244a-02f9-000000000037 12372 1727204079.49418: variable 'ansible_search_path' from source: unknown 12372 1727204079.49428: variable 'ansible_search_path' from source: unknown 12372 1727204079.49472: calling self._execute() 12372 1727204079.49572: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204079.49586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204079.49606: variable 'omit' from source: magic vars 12372 1727204079.50126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204079.52674: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204079.52773: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204079.52828: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204079.52876: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204079.52920: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204079.53026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204079.53136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204079.53140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204079.53168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204079.53193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204079.53373: variable 'ansible_distribution' from source: facts 12372 1727204079.53387: variable 'ansible_distribution_major_version' from source: facts 12372 1727204079.53407: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204079.53415: when evaluation is False, skipping this task 12372 1727204079.53424: _execute() done 12372 1727204079.53431: dumping result to json 12372 1727204079.53441: done dumping result, returning 12372 1727204079.53458: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-244a-02f9-000000000037] 12372 1727204079.53473: sending task result for task 12b410aa-8751-244a-02f9-000000000037 skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", 
"skip_reason": "Conditional result was False" } 12372 1727204079.53650: no more pending results, returning what we have 12372 1727204079.53654: results queue empty 12372 1727204079.53656: checking for any_errors_fatal 12372 1727204079.53663: done checking for any_errors_fatal 12372 1727204079.53664: checking for max_fail_percentage 12372 1727204079.53666: done checking for max_fail_percentage 12372 1727204079.53667: checking to see if all hosts have failed and the running result is not ok 12372 1727204079.53668: done checking to see if all hosts have failed 12372 1727204079.53669: getting the remaining hosts for this loop 12372 1727204079.53671: done getting the remaining hosts for this loop 12372 1727204079.53676: getting the next task for host managed-node3 12372 1727204079.53685: done getting next task for host managed-node3 12372 1727204079.53692: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12372 1727204079.53696: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204079.53712: getting variables 12372 1727204079.53715: in VariableManager get_vars() 12372 1727204079.53781: Calling all_inventory to load vars for managed-node3 12372 1727204079.53785: Calling groups_inventory to load vars for managed-node3 12372 1727204079.53788: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204079.54002: done sending task result for task 12b410aa-8751-244a-02f9-000000000037 12372 1727204079.54005: WORKER PROCESS EXITING 12372 1727204079.54015: Calling all_plugins_play to load vars for managed-node3 12372 1727204079.54018: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204079.54022: Calling groups_plugins_play to load vars for managed-node3 12372 1727204079.54373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204079.54659: done with get_vars() 12372 1727204079.54672: done getting variables 12372 1727204079.54742: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.061) 0:00:06.533 ***** 12372 1727204079.54783: entering _queue_task() for managed-node3/debug 12372 1727204079.55075: worker is 1 (out of 1 available) 12372 1727204079.55294: exiting _queue_task() for managed-node3/debug 12372 1727204079.55305: done queuing things up, now waiting for results queue to drain 12372 1727204079.55307: waiting for pending results... 
12372 1727204079.55399: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12372 1727204079.55642: in run() - task 12b410aa-8751-244a-02f9-000000000038 12372 1727204079.55647: variable 'ansible_search_path' from source: unknown 12372 1727204079.55650: variable 'ansible_search_path' from source: unknown 12372 1727204079.55653: calling self._execute() 12372 1727204079.55757: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204079.55772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204079.55792: variable 'omit' from source: magic vars 12372 1727204079.56434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204079.59049: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204079.59140: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204079.59232: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204079.59247: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204079.59280: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204079.59385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204079.59430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204079.59695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204079.59699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204079.59702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204079.59718: variable 'ansible_distribution' from source: facts 12372 1727204079.59732: variable 'ansible_distribution_major_version' from source: facts 12372 1727204079.59750: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204079.59759: when evaluation is False, skipping this task 12372 1727204079.59767: _execute() done 12372 1727204079.59775: dumping result to json 12372 1727204079.59784: done dumping result, returning 12372 1727204079.59802: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-244a-02f9-000000000038] 12372 1727204079.59816: sending task result for task 12b410aa-8751-244a-02f9-000000000038 skipping: [managed-node3] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)" } 12372 1727204079.59979: no more pending results, returning what we have 12372 1727204079.59983: results queue empty 12372 1727204079.59984: checking for any_errors_fatal 12372 1727204079.59994: done checking for any_errors_fatal 12372 1727204079.59995: checking for max_fail_percentage 12372 1727204079.59998: done checking for max_fail_percentage 12372 1727204079.59999: checking to see if all hosts have failed and the running result is not ok 12372 1727204079.60000: done checking to see if all hosts have failed 12372 1727204079.60001: getting the remaining hosts for this loop 12372 1727204079.60003: done getting the remaining hosts for this loop 12372 1727204079.60008: getting the next task for host managed-node3 12372 1727204079.60016: done getting next task for host managed-node3 12372 1727204079.60021: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12372 1727204079.60024: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204079.60040: getting variables 12372 1727204079.60042: in VariableManager get_vars() 12372 1727204079.60212: Calling all_inventory to load vars for managed-node3 12372 1727204079.60216: Calling groups_inventory to load vars for managed-node3 12372 1727204079.60219: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204079.60232: Calling all_plugins_play to load vars for managed-node3 12372 1727204079.60236: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204079.60240: Calling groups_plugins_play to load vars for managed-node3 12372 1727204079.60725: done sending task result for task 12b410aa-8751-244a-02f9-000000000038 12372 1727204079.60728: WORKER PROCESS EXITING 12372 1727204079.60758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204079.61049: done with get_vars() 12372 1727204079.61063: done getting variables 12372 1727204079.61133: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.063) 0:00:06.597 ***** 12372 1727204079.61171: entering _queue_task() for managed-node3/debug 12372 1727204079.61457: worker is 1 (out of 1 available) 12372 1727204079.61473: exiting _queue_task() for managed-node3/debug 12372 1727204079.61487: done queuing things up, now waiting for results queue to drain 12372 1727204079.61490: waiting for pending results... 
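All of these evaluations read the same two facts, ansible_distribution and ansible_distribution_major_version, both resolved 'from source: facts' in the records above; the log never prints the managed node's actual values, only the resulting False. A hypothetical one-off debug task, not part of the role, that would surface them:

- name: Show why the CentOS/RHEL < 9 tasks are skipped
  ansible.builtin.debug:
    msg: >-
      {{ ansible_distribution }} {{ ansible_distribution_major_version }} =>
      {{ ansible_distribution in ['CentOS', 'RedHat']
         and ansible_distribution_major_version | int < 9 }}

Any distribution other than CentOS or RedHat, or a major version of 9 or later, makes the expression False and produces the skip records seen throughout this section.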
12372 1727204079.61847: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12372 1727204079.62002: in run() - task 12b410aa-8751-244a-02f9-000000000039 12372 1727204079.62036: variable 'ansible_search_path' from source: unknown 12372 1727204079.62046: variable 'ansible_search_path' from source: unknown 12372 1727204079.62095: calling self._execute() 12372 1727204079.62200: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204079.62232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204079.62266: variable 'omit' from source: magic vars 12372 1727204079.62671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204079.64601: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204079.64661: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204079.64710: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204079.64757: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204079.64796: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204079.64899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204079.64996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204079.64999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204079.65034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204079.65058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204079.65226: variable 'ansible_distribution' from source: facts 12372 1727204079.65245: variable 'ansible_distribution_major_version' from source: facts 12372 1727204079.65258: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204079.65262: when evaluation is False, skipping this task 12372 1727204079.65307: _execute() done 12372 1727204079.65314: dumping result to json 12372 1727204079.65320: done dumping result, returning 12372 1727204079.65324: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-244a-02f9-000000000039] 12372 1727204079.65326: sending task result for task 12b410aa-8751-244a-02f9-000000000039 12372 1727204079.65396: done sending task result for task 12b410aa-8751-244a-02f9-000000000039 12372 1727204079.65399: WORKER 
PROCESS EXITING skipping: [managed-node3] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12372 1727204079.65478: no more pending results, returning what we have 12372 1727204079.65482: results queue empty 12372 1727204079.65483: checking for any_errors_fatal 12372 1727204079.65491: done checking for any_errors_fatal 12372 1727204079.65492: checking for max_fail_percentage 12372 1727204079.65494: done checking for max_fail_percentage 12372 1727204079.65495: checking to see if all hosts have failed and the running result is not ok 12372 1727204079.65496: done checking to see if all hosts have failed 12372 1727204079.65497: getting the remaining hosts for this loop 12372 1727204079.65499: done getting the remaining hosts for this loop 12372 1727204079.65503: getting the next task for host managed-node3 12372 1727204079.65510: done getting next task for host managed-node3 12372 1727204079.65514: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12372 1727204079.65526: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204079.65541: getting variables 12372 1727204079.65543: in VariableManager get_vars() 12372 1727204079.65596: Calling all_inventory to load vars for managed-node3 12372 1727204079.65600: Calling groups_inventory to load vars for managed-node3 12372 1727204079.65603: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204079.65612: Calling all_plugins_play to load vars for managed-node3 12372 1727204079.65618: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204079.65621: Calling groups_plugins_play to load vars for managed-node3 12372 1727204079.65769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204079.65940: done with get_vars() 12372 1727204079.65950: done getting variables 12372 1727204079.66002: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.048) 0:00:06.646 ***** 12372 1727204079.66030: entering _queue_task() for managed-node3/debug 12372 1727204079.66240: worker is 1 (out of 1 available) 12372 1727204079.66255: exiting _queue_task() for managed-node3/debug 12372 1727204079.66268: done queuing things up, now waiting for results queue to drain 12372 1727204079.66270: waiting for pending results... 
12372 1727204079.66454: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12372 1727204079.66550: in run() - task 12b410aa-8751-244a-02f9-00000000003a 12372 1727204079.66562: variable 'ansible_search_path' from source: unknown 12372 1727204079.66565: variable 'ansible_search_path' from source: unknown 12372 1727204079.66599: calling self._execute() 12372 1727204079.66673: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204079.66678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204079.66691: variable 'omit' from source: magic vars 12372 1727204079.67118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204079.69861: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204079.69876: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204079.69927: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204079.69985: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204079.70025: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204079.70134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204079.70178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204079.70225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204079.70282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204079.70318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204079.70495: variable 'ansible_distribution' from source: facts 12372 1727204079.70510: variable 'ansible_distribution_major_version' from source: facts 12372 1727204079.70536: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204079.70695: when evaluation is False, skipping this task 12372 1727204079.70699: _execute() done 12372 1727204079.70702: dumping result to json 12372 1727204079.70705: done dumping result, returning 12372 1727204079.70708: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-244a-02f9-00000000003a] 12372 1727204079.70710: sending task result for task 12b410aa-8751-244a-02f9-00000000003a 12372 1727204079.70792: done sending task result for task 12b410aa-8751-244a-02f9-00000000003a 12372 1727204079.70797: WORKER PROCESS EXITING 
skipping: [managed-node3] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12372 1727204079.70848: no more pending results, returning what we have 12372 1727204079.70852: results queue empty 12372 1727204079.70853: checking for any_errors_fatal 12372 1727204079.70860: done checking for any_errors_fatal 12372 1727204079.70861: checking for max_fail_percentage 12372 1727204079.70863: done checking for max_fail_percentage 12372 1727204079.70864: checking to see if all hosts have failed and the running result is not ok 12372 1727204079.70865: done checking to see if all hosts have failed 12372 1727204079.70866: getting the remaining hosts for this loop 12372 1727204079.70867: done getting the remaining hosts for this loop 12372 1727204079.70871: getting the next task for host managed-node3 12372 1727204079.70880: done getting next task for host managed-node3 12372 1727204079.70885: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12372 1727204079.70888: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204079.70905: getting variables 12372 1727204079.70907: in VariableManager get_vars() 12372 1727204079.71084: Calling all_inventory to load vars for managed-node3 12372 1727204079.71088: Calling groups_inventory to load vars for managed-node3 12372 1727204079.71093: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204079.71103: Calling all_plugins_play to load vars for managed-node3 12372 1727204079.71107: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204079.71112: Calling groups_plugins_play to load vars for managed-node3 12372 1727204079.71356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204079.71681: done with get_vars() 12372 1727204079.71696: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.057) 0:00:06.703 ***** 12372 1727204079.71812: entering _queue_task() for managed-node3/ping 12372 1727204079.71814: Creating lock for ping 12372 1727204079.72194: worker is 1 (out of 1 available) 12372 1727204079.72207: exiting _queue_task() for managed-node3/ping 12372 1727204079.72330: done queuing things up, now waiting for results queue to drain 12372 1727204079.72333: waiting for pending results... 
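'Re-test connectivity' is queued through the ping action ('Creating lock for ping' above). A minimal sketch of that task; only the task name, the ping action, and the conditional recorded as false_condition appear in the log, the rest is an assumption:

- name: Re-test connectivity
  ansible.builtin.ping:
  when: >-
    (ansible_distribution in ['CentOS','RedHat']
     and ansible_distribution_major_version | int < 9)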
12372 1727204079.72576: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 12372 1727204079.72605: in run() - task 12b410aa-8751-244a-02f9-00000000003b 12372 1727204079.72627: variable 'ansible_search_path' from source: unknown 12372 1727204079.72636: variable 'ansible_search_path' from source: unknown 12372 1727204079.72687: calling self._execute() 12372 1727204079.72796: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204079.72894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204079.72898: variable 'omit' from source: magic vars 12372 1727204079.73380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204079.76226: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204079.76327: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204079.76384: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204079.76438: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204079.76475: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204079.76583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204079.76795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204079.76799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204079.76801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204079.76804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204079.76934: variable 'ansible_distribution' from source: facts 12372 1727204079.76947: variable 'ansible_distribution_major_version' from source: facts 12372 1727204079.76962: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204079.76970: when evaluation is False, skipping this task 12372 1727204079.76977: _execute() done 12372 1727204079.76984: dumping result to json 12372 1727204079.76995: done dumping result, returning 12372 1727204079.77008: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-244a-02f9-00000000003b] 12372 1727204079.77019: sending task result for task 12b410aa-8751-244a-02f9-00000000003b skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": 
"Conditional result was False" } 12372 1727204079.77188: no more pending results, returning what we have 12372 1727204079.77194: results queue empty 12372 1727204079.77196: checking for any_errors_fatal 12372 1727204079.77205: done checking for any_errors_fatal 12372 1727204079.77206: checking for max_fail_percentage 12372 1727204079.77209: done checking for max_fail_percentage 12372 1727204079.77210: checking to see if all hosts have failed and the running result is not ok 12372 1727204079.77211: done checking to see if all hosts have failed 12372 1727204079.77212: getting the remaining hosts for this loop 12372 1727204079.77214: done getting the remaining hosts for this loop 12372 1727204079.77218: getting the next task for host managed-node3 12372 1727204079.77229: done getting next task for host managed-node3 12372 1727204079.77233: ^ task is: TASK: meta (role_complete) 12372 1727204079.77237: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204079.77406: getting variables 12372 1727204079.77409: in VariableManager get_vars() 12372 1727204079.77473: Calling all_inventory to load vars for managed-node3 12372 1727204079.77476: Calling groups_inventory to load vars for managed-node3 12372 1727204079.77479: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204079.77499: Calling all_plugins_play to load vars for managed-node3 12372 1727204079.77504: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204079.77510: done sending task result for task 12b410aa-8751-244a-02f9-00000000003b 12372 1727204079.77513: WORKER PROCESS EXITING 12372 1727204079.77517: Calling groups_plugins_play to load vars for managed-node3 12372 1727204079.77967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204079.78315: done with get_vars() 12372 1727204079.78329: done getting variables 12372 1727204079.78432: done queuing things up, now waiting for results queue to drain 12372 1727204079.78435: results queue empty 12372 1727204079.78436: checking for any_errors_fatal 12372 1727204079.78439: done checking for any_errors_fatal 12372 1727204079.78440: checking for max_fail_percentage 12372 1727204079.78441: done checking for max_fail_percentage 12372 1727204079.78443: checking to see if all hosts have failed and the running result is not ok 12372 1727204079.78444: done checking to see if all hosts have failed 12372 1727204079.78445: getting the remaining hosts for this loop 12372 1727204079.78446: done getting the remaining hosts for this loop 12372 1727204079.78449: getting the next task for host managed-node3 12372 1727204079.78455: done getting next task for host managed-node3 12372 1727204079.78458: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12372 1727204079.78460: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204079.78463: getting variables 12372 1727204079.78464: in VariableManager get_vars() 12372 1727204079.78499: Calling all_inventory to load vars for managed-node3 12372 1727204079.78502: Calling groups_inventory to load vars for managed-node3 12372 1727204079.78505: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204079.78511: Calling all_plugins_play to load vars for managed-node3 12372 1727204079.78520: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204079.78524: Calling groups_plugins_play to load vars for managed-node3 12372 1727204079.78730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204079.79037: done with get_vars() 12372 1727204079.79047: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.073) 0:00:06.777 ***** 12372 1727204079.79139: entering _queue_task() for managed-node3/include_tasks 12372 1727204079.79448: worker is 1 (out of 1 available) 12372 1727204079.79578: exiting _queue_task() for managed-node3/include_tasks 12372 1727204079.79593: done queuing things up, now waiting for results queue to drain 12372 1727204079.79595: waiting for pending results... 
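Every task in this stretch of the run is skipped by the same gating conditional, (ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9), which evaluates to False on managed-node3, so the TaskExecutor dumps a skip result with "skip_reason": "Conditional result was False" and moves on. A minimal sketch of a block-level when: that would produce this pattern follows; the task body and file layout are assumptions, not the collection's actual source.

# Hedged sketch: a block-level `when:` that yields the repeated
# "Conditional result was False" skips seen in this log.
# The task body and names are assumptions.
- name: Run the bond removal test steps only on EL < 9
  when: >-
    ansible_distribution in ['CentOS', 'RedHat']
    and ansible_distribution_major_version | int < 9
  block:
    - name: Re-test connectivity
      ansible.builtin.ping: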
12372 1727204079.79798: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 12372 1727204079.79952: in run() - task 12b410aa-8751-244a-02f9-00000000006e 12372 1727204079.79974: variable 'ansible_search_path' from source: unknown 12372 1727204079.79984: variable 'ansible_search_path' from source: unknown 12372 1727204079.80044: calling self._execute() 12372 1727204079.80154: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204079.80170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204079.80188: variable 'omit' from source: magic vars 12372 1727204079.80797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204079.84035: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204079.84132: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204079.84263: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204079.84266: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204079.84269: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204079.84375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204079.84428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204079.84464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204079.84533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204079.84555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204079.84739: variable 'ansible_distribution' from source: facts 12372 1727204079.84755: variable 'ansible_distribution_major_version' from source: facts 12372 1727204079.84807: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204079.84818: when evaluation is False, skipping this task 12372 1727204079.84822: _execute() done 12372 1727204079.84825: dumping result to json 12372 1727204079.84827: done dumping result, returning 12372 1727204079.84830: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-244a-02f9-00000000006e] 12372 1727204079.84839: sending task result for task 12b410aa-8751-244a-02f9-00000000006e 12372 1727204079.85196: done sending task result for task 12b410aa-8751-244a-02f9-00000000006e 12372 1727204079.85200: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": 
"(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204079.85253: no more pending results, returning what we have 12372 1727204079.85257: results queue empty 12372 1727204079.85259: checking for any_errors_fatal 12372 1727204079.85261: done checking for any_errors_fatal 12372 1727204079.85262: checking for max_fail_percentage 12372 1727204079.85264: done checking for max_fail_percentage 12372 1727204079.85265: checking to see if all hosts have failed and the running result is not ok 12372 1727204079.85266: done checking to see if all hosts have failed 12372 1727204079.85267: getting the remaining hosts for this loop 12372 1727204079.85268: done getting the remaining hosts for this loop 12372 1727204079.85273: getting the next task for host managed-node3 12372 1727204079.85279: done getting next task for host managed-node3 12372 1727204079.85282: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 12372 1727204079.85286: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204079.85293: getting variables 12372 1727204079.85294: in VariableManager get_vars() 12372 1727204079.85358: Calling all_inventory to load vars for managed-node3 12372 1727204079.85362: Calling groups_inventory to load vars for managed-node3 12372 1727204079.85365: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204079.85376: Calling all_plugins_play to load vars for managed-node3 12372 1727204079.85380: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204079.85384: Calling groups_plugins_play to load vars for managed-node3 12372 1727204079.85803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204079.86117: done with get_vars() 12372 1727204079.86129: done getting variables 12372 1727204079.86204: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12372 1727204079.86350: variable 'interface' from source: task vars 12372 1727204079.86354: variable 'controller_device' from source: play vars 12372 1727204079.86437: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'nm-bond'] ************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.073) 0:00:06.850 ***** 12372 1727204079.86472: entering _queue_task() for managed-node3/assert 12372 1727204079.86928: worker is 1 (out of 1 available) 12372 1727204079.86939: exiting _queue_task() for managed-node3/assert 12372 1727204079.86951: done queuing things up, now waiting for 
results queue to drain 12372 1727204079.86952: waiting for pending results... 12372 1727204079.87112: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'nm-bond' 12372 1727204079.87276: in run() - task 12b410aa-8751-244a-02f9-00000000006f 12372 1727204079.87307: variable 'ansible_search_path' from source: unknown 12372 1727204079.87316: variable 'ansible_search_path' from source: unknown 12372 1727204079.87367: calling self._execute() 12372 1727204079.87510: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204079.87514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204079.87517: variable 'omit' from source: magic vars 12372 1727204079.88080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204079.91331: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204079.91432: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204079.91491: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204079.91612: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204079.91656: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204079.91766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204079.91818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204079.91893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204079.91925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204079.91948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204079.92132: variable 'ansible_distribution' from source: facts 12372 1727204079.92227: variable 'ansible_distribution_major_version' from source: facts 12372 1727204079.92231: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204079.92237: when evaluation is False, skipping this task 12372 1727204079.92240: _execute() done 12372 1727204079.92244: dumping result to json 12372 1727204079.92247: done dumping result, returning 12372 1727204079.92250: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'nm-bond' [12b410aa-8751-244a-02f9-00000000006f] 12372 1727204079.92252: sending task result for task 12b410aa-8751-244a-02f9-00000000006f 12372 1727204079.92349: done sending task result for task 12b410aa-8751-244a-02f9-00000000006f 12372 1727204079.92353: 
WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204079.92415: no more pending results, returning what we have 12372 1727204079.92419: results queue empty 12372 1727204079.92420: checking for any_errors_fatal 12372 1727204079.92427: done checking for any_errors_fatal 12372 1727204079.92428: checking for max_fail_percentage 12372 1727204079.92429: done checking for max_fail_percentage 12372 1727204079.92430: checking to see if all hosts have failed and the running result is not ok 12372 1727204079.92431: done checking to see if all hosts have failed 12372 1727204079.92432: getting the remaining hosts for this loop 12372 1727204079.92434: done getting the remaining hosts for this loop 12372 1727204079.92438: getting the next task for host managed-node3 12372 1727204079.92447: done getting next task for host managed-node3 12372 1727204079.92451: ^ task is: TASK: Include the task 'assert_profile_present.yml' 12372 1727204079.92453: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204079.92457: getting variables 12372 1727204079.92458: in VariableManager get_vars() 12372 1727204079.92537: Calling all_inventory to load vars for managed-node3 12372 1727204079.92541: Calling groups_inventory to load vars for managed-node3 12372 1727204079.92543: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204079.92553: Calling all_plugins_play to load vars for managed-node3 12372 1727204079.92556: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204079.92559: Calling groups_plugins_play to load vars for managed-node3 12372 1727204079.92722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204079.92886: done with get_vars() 12372 1727204079.92897: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:67 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.065) 0:00:06.915 ***** 12372 1727204079.92974: entering _queue_task() for managed-node3/include_tasks 12372 1727204079.93199: worker is 1 (out of 1 available) 12372 1727204079.93213: exiting _queue_task() for managed-node3/include_tasks 12372 1727204079.93228: done queuing things up, now waiting for results queue to drain 12372 1727204079.93230: waiting for pending results... 
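The include of 'assert_profile_present.yml' that runs next is looped: the log resolves controller_profile, port1_profile, and port2_profile from play vars, and the per-item skip results that follow show the items bond0, bond0.0, and bond0.1 with ansible_loop_var "item". A hedged sketch of a looped include of that shape; only the variable names and item values come from the log, the rest is assumed.

# Hedged sketch of the looped include implied by the per-item skips below.
# The `profile:` var passed to the included file is an assumption.
- name: Include the task 'assert_profile_present.yml'
  ansible.builtin.include_tasks: tasks/assert_profile_present.yml
  vars:
    profile: "{{ item }}"
  loop:
    - "{{ controller_profile }}"   # bond0
    - "{{ port1_profile }}"        # bond0.0
    - "{{ port2_profile }}"        # bond0.1

Because the gating conditional is False, each loop item is skipped individually and the task finishes with "All items skipped".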
12372 1727204079.93408: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_present.yml' 12372 1727204079.93480: in run() - task 12b410aa-8751-244a-02f9-000000000070 12372 1727204079.93493: variable 'ansible_search_path' from source: unknown 12372 1727204079.93537: variable 'controller_profile' from source: play vars 12372 1727204079.93694: variable 'controller_profile' from source: play vars 12372 1727204079.93707: variable 'port1_profile' from source: play vars 12372 1727204079.93767: variable 'port1_profile' from source: play vars 12372 1727204079.93774: variable 'port2_profile' from source: play vars 12372 1727204079.93834: variable 'port2_profile' from source: play vars 12372 1727204079.93845: variable 'omit' from source: magic vars 12372 1727204079.93955: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204079.93965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204079.93977: variable 'omit' from source: magic vars 12372 1727204079.94595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204079.97279: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204079.97369: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204079.97432: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204079.97480: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204079.97538: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204079.97659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204079.97710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204079.97760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204079.97830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204079.97859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204079.98030: variable 'ansible_distribution' from source: facts 12372 1727204079.98095: variable 'ansible_distribution_major_version' from source: facts 12372 1727204079.98099: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204079.98101: when evaluation is False, skipping this task 12372 1727204079.98125: variable 'item' from source: unknown 12372 1727204079.98225: variable 'item' from source: unknown skipping: [managed-node3] => (item=bond0) => { "ansible_loop_var": "item", "changed": false, 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "item": "bond0", "skip_reason": "Conditional result was False" } 12372 1727204079.98609: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204079.98613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204079.98615: variable 'omit' from source: magic vars 12372 1727204079.98956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204079.98959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204079.98962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204079.99285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204079.99291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204079.99429: variable 'ansible_distribution' from source: facts 12372 1727204079.99432: variable 'ansible_distribution_major_version' from source: facts 12372 1727204079.99441: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204079.99449: when evaluation is False, skipping this task 12372 1727204079.99485: variable 'item' from source: unknown 12372 1727204079.99576: variable 'item' from source: unknown skipping: [managed-node3] => (item=bond0.0) => { "ansible_loop_var": "item", "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "item": "bond0.0", "skip_reason": "Conditional result was False" } 12372 1727204079.99849: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204079.99907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204079.99911: variable 'omit' from source: magic vars 12372 1727204080.00237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204080.00284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204080.00326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204080.00573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204080.00577: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204080.00718: variable 'ansible_distribution' from source: facts 12372 1727204080.00726: variable 'ansible_distribution_major_version' from source: facts 12372 1727204080.00735: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204080.00738: when evaluation is False, skipping this task 12372 1727204080.00771: variable 'item' from source: unknown 12372 1727204080.00902: variable 'item' from source: unknown skipping: [managed-node3] => (item=bond0.1) => { "ansible_loop_var": "item", "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "item": "bond0.1", "skip_reason": "Conditional result was False" } 12372 1727204080.01058: dumping result to json 12372 1727204080.01061: done dumping result, returning 12372 1727204080.01063: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_present.yml' [12b410aa-8751-244a-02f9-000000000070] 12372 1727204080.01065: sending task result for task 12b410aa-8751-244a-02f9-000000000070 12372 1727204080.01107: done sending task result for task 12b410aa-8751-244a-02f9-000000000070 12372 1727204080.01109: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false } MSG: All items skipped 12372 1727204080.01205: no more pending results, returning what we have 12372 1727204080.01208: results queue empty 12372 1727204080.01209: checking for any_errors_fatal 12372 1727204080.01215: done checking for any_errors_fatal 12372 1727204080.01216: checking for max_fail_percentage 12372 1727204080.01217: done checking for max_fail_percentage 12372 1727204080.01218: checking to see if all hosts have failed and the running result is not ok 12372 1727204080.01223: done checking to see if all hosts have failed 12372 1727204080.01225: getting the remaining hosts for this loop 12372 1727204080.01226: done getting the remaining hosts for this loop 12372 1727204080.01230: getting the next task for host managed-node3 12372 1727204080.01235: done getting next task for host managed-node3 12372 1727204080.01238: ^ task is: TASK: ** TEST check polling interval 12372 1727204080.01240: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204080.01244: getting variables 12372 1727204080.01245: in VariableManager get_vars() 12372 1727204080.01306: Calling all_inventory to load vars for managed-node3 12372 1727204080.01310: Calling groups_inventory to load vars for managed-node3 12372 1727204080.01313: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204080.01324: Calling all_plugins_play to load vars for managed-node3 12372 1727204080.01327: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204080.01337: Calling groups_plugins_play to load vars for managed-node3 12372 1727204080.01648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204080.01962: done with get_vars() 12372 1727204080.01975: done getting variables 12372 1727204080.02058: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check polling interval] ****************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:75 Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.091) 0:00:07.006 ***** 12372 1727204080.02087: entering _queue_task() for managed-node3/command 12372 1727204080.02319: worker is 1 (out of 1 available) 12372 1727204080.02333: exiting _queue_task() for managed-node3/command 12372 1727204080.02345: done queuing things up, now waiting for results queue to drain 12372 1727204080.02348: waiting for pending results... 12372 1727204080.02551: running TaskExecutor() for managed-node3/TASK: ** TEST check polling interval 12372 1727204080.02638: in run() - task 12b410aa-8751-244a-02f9-000000000071 12372 1727204080.02651: variable 'ansible_search_path' from source: unknown 12372 1727204080.02683: calling self._execute() 12372 1727204080.02760: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204080.02767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204080.02778: variable 'omit' from source: magic vars 12372 1727204080.03146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204080.05446: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204080.05503: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204080.05536: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204080.05567: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204080.05594: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204080.05662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204080.05693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 12372 1727204080.05713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204080.05748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204080.05760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204080.05869: variable 'ansible_distribution' from source: facts 12372 1727204080.05875: variable 'ansible_distribution_major_version' from source: facts 12372 1727204080.05886: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204080.05892: when evaluation is False, skipping this task 12372 1727204080.05895: _execute() done 12372 1727204080.05897: dumping result to json 12372 1727204080.05906: done dumping result, returning 12372 1727204080.05910: done running TaskExecutor() for managed-node3/TASK: ** TEST check polling interval [12b410aa-8751-244a-02f9-000000000071] 12372 1727204080.05920: sending task result for task 12b410aa-8751-244a-02f9-000000000071 12372 1727204080.06010: done sending task result for task 12b410aa-8751-244a-02f9-000000000071 12372 1727204080.06013: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204080.06074: no more pending results, returning what we have 12372 1727204080.06077: results queue empty 12372 1727204080.06078: checking for any_errors_fatal 12372 1727204080.06086: done checking for any_errors_fatal 12372 1727204080.06087: checking for max_fail_percentage 12372 1727204080.06092: done checking for max_fail_percentage 12372 1727204080.06093: checking to see if all hosts have failed and the running result is not ok 12372 1727204080.06094: done checking to see if all hosts have failed 12372 1727204080.06095: getting the remaining hosts for this loop 12372 1727204080.06096: done getting the remaining hosts for this loop 12372 1727204080.06101: getting the next task for host managed-node3 12372 1727204080.06107: done getting next task for host managed-node3 12372 1727204080.06110: ^ task is: TASK: ** TEST check IPv4 12372 1727204080.06112: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204080.06118: getting variables 12372 1727204080.06120: in VariableManager get_vars() 12372 1727204080.06180: Calling all_inventory to load vars for managed-node3 12372 1727204080.06184: Calling groups_inventory to load vars for managed-node3 12372 1727204080.06314: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204080.06328: Calling all_plugins_play to load vars for managed-node3 12372 1727204080.06331: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204080.06335: Calling groups_plugins_play to load vars for managed-node3 12372 1727204080.06642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204080.07248: done with get_vars() 12372 1727204080.07260: done getting variables 12372 1727204080.07343: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:80 Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.052) 0:00:07.059 ***** 12372 1727204080.07374: entering _queue_task() for managed-node3/command 12372 1727204080.07628: worker is 1 (out of 1 available) 12372 1727204080.07643: exiting _queue_task() for managed-node3/command 12372 1727204080.07654: done queuing things up, now waiting for results queue to drain 12372 1727204080.07657: waiting for pending results... 12372 1727204080.08075: running TaskExecutor() for managed-node3/TASK: ** TEST check IPv4 12372 1727204080.08135: in run() - task 12b410aa-8751-244a-02f9-000000000072 12372 1727204080.08149: variable 'ansible_search_path' from source: unknown 12372 1727204080.08208: calling self._execute() 12372 1727204080.08353: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204080.08358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204080.08362: variable 'omit' from source: magic vars 12372 1727204080.08959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204080.12208: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204080.12333: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204080.12364: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204080.12410: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204080.12447: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204080.12572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204080.12591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 12372 1727204080.12625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204080.12684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204080.12767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204080.12858: variable 'ansible_distribution' from source: facts 12372 1727204080.12862: variable 'ansible_distribution_major_version' from source: facts 12372 1727204080.12875: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204080.12880: when evaluation is False, skipping this task 12372 1727204080.12889: _execute() done 12372 1727204080.12898: dumping result to json 12372 1727204080.12900: done dumping result, returning 12372 1727204080.12907: done running TaskExecutor() for managed-node3/TASK: ** TEST check IPv4 [12b410aa-8751-244a-02f9-000000000072] 12372 1727204080.12913: sending task result for task 12b410aa-8751-244a-02f9-000000000072 skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204080.13061: no more pending results, returning what we have 12372 1727204080.13065: results queue empty 12372 1727204080.13066: checking for any_errors_fatal 12372 1727204080.13074: done checking for any_errors_fatal 12372 1727204080.13075: checking for max_fail_percentage 12372 1727204080.13076: done checking for max_fail_percentage 12372 1727204080.13077: checking to see if all hosts have failed and the running result is not ok 12372 1727204080.13078: done checking to see if all hosts have failed 12372 1727204080.13079: getting the remaining hosts for this loop 12372 1727204080.13080: done getting the remaining hosts for this loop 12372 1727204080.13085: getting the next task for host managed-node3 12372 1727204080.13094: done getting next task for host managed-node3 12372 1727204080.13097: ^ task is: TASK: ** TEST check IPv6 12372 1727204080.13100: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204080.13104: getting variables 12372 1727204080.13105: in VariableManager get_vars() 12372 1727204080.13162: Calling all_inventory to load vars for managed-node3 12372 1727204080.13165: Calling groups_inventory to load vars for managed-node3 12372 1727204080.13168: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204080.13178: Calling all_plugins_play to load vars for managed-node3 12372 1727204080.13181: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204080.13184: Calling groups_plugins_play to load vars for managed-node3 12372 1727204080.13201: done sending task result for task 12b410aa-8751-244a-02f9-000000000072 12372 1727204080.13204: WORKER PROCESS EXITING 12372 1727204080.13368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204080.13535: done with get_vars() 12372 1727204080.13545: done getting variables 12372 1727204080.13595: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:87 Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.062) 0:00:07.121 ***** 12372 1727204080.13618: entering _queue_task() for managed-node3/command 12372 1727204080.13829: worker is 1 (out of 1 available) 12372 1727204080.13844: exiting _queue_task() for managed-node3/command 12372 1727204080.13857: done queuing things up, now waiting for results queue to drain 12372 1727204080.13859: waiting for pending results... 
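The "** TEST check ..." tasks (polling interval, IPv4, and now IPv6) each load the command action plugin before being skipped. The actual command strings are not visible in this log, so the sketch below only illustrates a command-based check of this kind; the cmd text, register names, and changed_when handling are assumptions.

# Hedged sketch only: the real commands in tests_bond_removal.yml are not
# shown in this log; cmd strings and register names are assumptions.
- name: "** TEST check polling interval"
  ansible.builtin.command:
    cmd: grep 'MII Polling Interval' /proc/net/bonding/{{ controller_device }}
  register: bond_polling
  changed_when: false

- name: "** TEST check IPv4"
  ansible.builtin.command:
    cmd: ip -4 addr show {{ controller_device }}
  register: bond_ipv4
  changed_when: false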
12372 1727204080.14039: running TaskExecutor() for managed-node3/TASK: ** TEST check IPv6 12372 1727204080.14104: in run() - task 12b410aa-8751-244a-02f9-000000000073 12372 1727204080.14117: variable 'ansible_search_path' from source: unknown 12372 1727204080.14153: calling self._execute() 12372 1727204080.14233: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204080.14239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204080.14250: variable 'omit' from source: magic vars 12372 1727204080.14621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204080.16384: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204080.16442: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204080.16473: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204080.16514: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204080.16538: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204080.16609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204080.16635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204080.16656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204080.16688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204080.16705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204080.16810: variable 'ansible_distribution' from source: facts 12372 1727204080.16826: variable 'ansible_distribution_major_version' from source: facts 12372 1727204080.16829: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204080.16832: when evaluation is False, skipping this task 12372 1727204080.16835: _execute() done 12372 1727204080.16841: dumping result to json 12372 1727204080.16845: done dumping result, returning 12372 1727204080.16852: done running TaskExecutor() for managed-node3/TASK: ** TEST check IPv6 [12b410aa-8751-244a-02f9-000000000073] 12372 1727204080.16857: sending task result for task 12b410aa-8751-244a-02f9-000000000073 12372 1727204080.16949: done sending task result for task 12b410aa-8751-244a-02f9-000000000073 12372 1727204080.16952: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional 
result was False" } 12372 1727204080.17007: no more pending results, returning what we have 12372 1727204080.17011: results queue empty 12372 1727204080.17012: checking for any_errors_fatal 12372 1727204080.17017: done checking for any_errors_fatal 12372 1727204080.17018: checking for max_fail_percentage 12372 1727204080.17020: done checking for max_fail_percentage 12372 1727204080.17021: checking to see if all hosts have failed and the running result is not ok 12372 1727204080.17022: done checking to see if all hosts have failed 12372 1727204080.17023: getting the remaining hosts for this loop 12372 1727204080.17025: done getting the remaining hosts for this loop 12372 1727204080.17029: getting the next task for host managed-node3 12372 1727204080.17036: done getting next task for host managed-node3 12372 1727204080.17041: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12372 1727204080.17044: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204080.17060: getting variables 12372 1727204080.17061: in VariableManager get_vars() 12372 1727204080.17113: Calling all_inventory to load vars for managed-node3 12372 1727204080.17116: Calling groups_inventory to load vars for managed-node3 12372 1727204080.17119: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204080.17128: Calling all_plugins_play to load vars for managed-node3 12372 1727204080.17131: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204080.17134: Calling groups_plugins_play to load vars for managed-node3 12372 1727204080.17321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204080.17492: done with get_vars() 12372 1727204080.17501: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.039) 0:00:07.161 ***** 12372 1727204080.17576: entering _queue_task() for managed-node3/include_tasks 12372 1727204080.17769: worker is 1 (out of 1 available) 12372 1727204080.17785: exiting _queue_task() for managed-node3/include_tasks 12372 1727204080.17798: done queuing things up, now waiting for results queue to drain 12372 1727204080.17800: waiting for pending results... 
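Execution now enters the fedora.linux_system_roles.network role itself (task path roles/network/tasks/main.yml:4). Its first task, "Ensure ansible_facts used by role", is queued as an include_tasks and is skipped by the same conditional. A hedged sketch of a facts-guarding include of this shape; the included file name and the guard expression are assumptions, not confirmed by this log.

# Hedged sketch of a facts-guarding include; set_facts.yml and
# __network_required_facts are assumed names.
- name: Ensure ansible_facts used by role
  ansible.builtin.include_tasks: tasks/set_facts.yml
  when: >-
    __network_required_facts | difference(ansible_facts.keys() | list)
    | length > 0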
12372 1727204080.17978: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12372 1727204080.18082: in run() - task 12b410aa-8751-244a-02f9-00000000007b 12372 1727204080.18096: variable 'ansible_search_path' from source: unknown 12372 1727204080.18101: variable 'ansible_search_path' from source: unknown 12372 1727204080.18138: calling self._execute() 12372 1727204080.18205: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204080.18211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204080.18223: variable 'omit' from source: magic vars 12372 1727204080.18581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204080.20298: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204080.20362: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204080.20394: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204080.20427: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204080.20451: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204080.20521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204080.20547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204080.20568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204080.20602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204080.20614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204080.20725: variable 'ansible_distribution' from source: facts 12372 1727204080.20730: variable 'ansible_distribution_major_version' from source: facts 12372 1727204080.20742: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204080.20745: when evaluation is False, skipping this task 12372 1727204080.20748: _execute() done 12372 1727204080.20751: dumping result to json 12372 1727204080.20763: done dumping result, returning 12372 1727204080.20767: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-244a-02f9-00000000007b] 12372 1727204080.20770: sending task result for task 12b410aa-8751-244a-02f9-00000000007b 12372 1727204080.20856: done sending task result for task 12b410aa-8751-244a-02f9-00000000007b 12372 1727204080.20859: WORKER PROCESS EXITING skipping: 
[managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204080.20911: no more pending results, returning what we have 12372 1727204080.20918: results queue empty 12372 1727204080.20919: checking for any_errors_fatal 12372 1727204080.20924: done checking for any_errors_fatal 12372 1727204080.20925: checking for max_fail_percentage 12372 1727204080.20926: done checking for max_fail_percentage 12372 1727204080.20927: checking to see if all hosts have failed and the running result is not ok 12372 1727204080.20929: done checking to see if all hosts have failed 12372 1727204080.20929: getting the remaining hosts for this loop 12372 1727204080.20931: done getting the remaining hosts for this loop 12372 1727204080.20935: getting the next task for host managed-node3 12372 1727204080.20942: done getting next task for host managed-node3 12372 1727204080.20946: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12372 1727204080.20949: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204080.20973: getting variables 12372 1727204080.20975: in VariableManager get_vars() 12372 1727204080.21027: Calling all_inventory to load vars for managed-node3 12372 1727204080.21031: Calling groups_inventory to load vars for managed-node3 12372 1727204080.21033: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204080.21042: Calling all_plugins_play to load vars for managed-node3 12372 1727204080.21046: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204080.21049: Calling groups_plugins_play to load vars for managed-node3 12372 1727204080.21205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204080.21379: done with get_vars() 12372 1727204080.21392: done getting variables 12372 1727204080.21442: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.038) 0:00:07.200 ***** 12372 1727204080.21469: entering _queue_task() for managed-node3/debug 12372 1727204080.21686: worker is 1 (out of 1 available) 12372 1727204080.21701: exiting _queue_task() for managed-node3/debug 12372 1727204080.21714: done queuing things up, now waiting for results queue to drain 12372 1727204080.21719: waiting for pending results... 
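"Print network provider" resolves to the debug action plugin. A minimal hedged sketch consistent with the task name; the exact message wording and the network_provider variable reference are assumptions.

# Hedged sketch matching the "Print network provider" banner;
# the message text is an assumption.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"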
12372 1727204080.21893: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 12372 1727204080.22009: in run() - task 12b410aa-8751-244a-02f9-00000000007c 12372 1727204080.22017: variable 'ansible_search_path' from source: unknown 12372 1727204080.22023: variable 'ansible_search_path' from source: unknown 12372 1727204080.22060: calling self._execute() 12372 1727204080.22138: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204080.22146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204080.22158: variable 'omit' from source: magic vars 12372 1727204080.22535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204080.24388: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204080.24443: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204080.24477: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204080.24509: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204080.24534: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204080.24606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204080.24631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204080.24652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204080.24688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204080.24704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204080.24819: variable 'ansible_distribution' from source: facts 12372 1727204080.24823: variable 'ansible_distribution_major_version' from source: facts 12372 1727204080.24833: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204080.24836: when evaluation is False, skipping this task 12372 1727204080.24841: _execute() done 12372 1727204080.24844: dumping result to json 12372 1727204080.24849: done dumping result, returning 12372 1727204080.24858: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-244a-02f9-00000000007c] 12372 1727204080.24864: sending task result for task 12b410aa-8751-244a-02f9-00000000007c 12372 1727204080.24963: done sending task result for task 12b410aa-8751-244a-02f9-00000000007c 12372 1727204080.24966: WORKER PROCESS EXITING skipping: [managed-node3] => { 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12372 1727204080.25041: no more pending results, returning what we have 12372 1727204080.25045: results queue empty 12372 1727204080.25046: checking for any_errors_fatal 12372 1727204080.25051: done checking for any_errors_fatal 12372 1727204080.25052: checking for max_fail_percentage 12372 1727204080.25054: done checking for max_fail_percentage 12372 1727204080.25055: checking to see if all hosts have failed and the running result is not ok 12372 1727204080.25056: done checking to see if all hosts have failed 12372 1727204080.25057: getting the remaining hosts for this loop 12372 1727204080.25058: done getting the remaining hosts for this loop 12372 1727204080.25063: getting the next task for host managed-node3 12372 1727204080.25069: done getting next task for host managed-node3 12372 1727204080.25074: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12372 1727204080.25084: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204080.25104: getting variables 12372 1727204080.25106: in VariableManager get_vars() 12372 1727204080.25161: Calling all_inventory to load vars for managed-node3 12372 1727204080.25164: Calling groups_inventory to load vars for managed-node3 12372 1727204080.25167: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204080.25177: Calling all_plugins_play to load vars for managed-node3 12372 1727204080.25180: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204080.25183: Calling groups_plugins_play to load vars for managed-node3 12372 1727204080.25378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204080.25556: done with get_vars() 12372 1727204080.25565: done getting variables 12372 1727204080.25613: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.041) 0:00:07.242 ***** 12372 1727204080.25644: entering _queue_task() for managed-node3/fail 12372 1727204080.25861: worker is 1 (out of 1 available) 12372 1727204080.25876: exiting _queue_task() for managed-node3/fail 12372 1727204080.25887: done queuing things up, now waiting for results queue to drain 12372 1727204080.25891: waiting for pending results... 
12372 1727204080.26067: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12372 1727204080.26165: in run() - task 12b410aa-8751-244a-02f9-00000000007d 12372 1727204080.26177: variable 'ansible_search_path' from source: unknown 12372 1727204080.26181: variable 'ansible_search_path' from source: unknown 12372 1727204080.26218: calling self._execute() 12372 1727204080.26292: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204080.26299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204080.26309: variable 'omit' from source: magic vars 12372 1727204080.26670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204080.28472: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204080.28530: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204080.28560: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204080.28591: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204080.28614: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204080.28682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204080.28708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204080.28731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204080.28769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204080.28781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204080.28904: variable 'ansible_distribution' from source: facts 12372 1727204080.28910: variable 'ansible_distribution_major_version' from source: facts 12372 1727204080.28967: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204080.28971: when evaluation is False, skipping this task 12372 1727204080.28974: _execute() done 12372 1727204080.28976: dumping result to json 12372 1727204080.28978: done dumping result, returning 12372 1727204080.28981: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-244a-02f9-00000000007d] 12372 1727204080.28983: sending task result for task 
12b410aa-8751-244a-02f9-00000000007d 12372 1727204080.29055: done sending task result for task 12b410aa-8751-244a-02f9-00000000007d 12372 1727204080.29058: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204080.29110: no more pending results, returning what we have 12372 1727204080.29114: results queue empty 12372 1727204080.29121: checking for any_errors_fatal 12372 1727204080.29128: done checking for any_errors_fatal 12372 1727204080.29129: checking for max_fail_percentage 12372 1727204080.29131: done checking for max_fail_percentage 12372 1727204080.29132: checking to see if all hosts have failed and the running result is not ok 12372 1727204080.29133: done checking to see if all hosts have failed 12372 1727204080.29134: getting the remaining hosts for this loop 12372 1727204080.29135: done getting the remaining hosts for this loop 12372 1727204080.29139: getting the next task for host managed-node3 12372 1727204080.29145: done getting next task for host managed-node3 12372 1727204080.29149: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12372 1727204080.29153: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204080.29169: getting variables 12372 1727204080.29170: in VariableManager get_vars() 12372 1727204080.29241: Calling all_inventory to load vars for managed-node3 12372 1727204080.29243: Calling groups_inventory to load vars for managed-node3 12372 1727204080.29245: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204080.29253: Calling all_plugins_play to load vars for managed-node3 12372 1727204080.29255: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204080.29257: Calling groups_plugins_play to load vars for managed-node3 12372 1727204080.29413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204080.29585: done with get_vars() 12372 1727204080.29596: done getting variables 12372 1727204080.29646: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.040) 0:00:07.282 ***** 12372 1727204080.29671: entering _queue_task() for managed-node3/fail 12372 1727204080.29880: worker is 1 (out of 1 available) 12372 1727204080.29898: exiting _queue_task() for managed-node3/fail 12372 1727204080.29912: done queuing things up, now waiting for results queue to drain 12372 1727204080.29914: waiting for pending results... 
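The next guard, "Abort applying the network state configuration if the system version of the managed host is below 8" (tasks/main.yml:18), is another fail task, this time keyed on the distribution version. Only the "below 8" threshold is taken from the task name in the log; the rest of the sketch, including the extra network_state condition, is an assumption about how such a minimum-version check is commonly written.

    # Illustrative minimum-version guard; conditions and message are assumed.
    - name: Abort applying network_state on EL hosts older than 8
      ansible.builtin.fail:
        msg: "Applying network_state requires an EL 8 or later managed host."
      when:
        - network_state | default({}) | length > 0
        - ansible_distribution in ['CentOS', 'RedHat']
        - ansible_distribution_major_version | int < 8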
12372 1727204080.30103: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12372 1727204080.30200: in run() - task 12b410aa-8751-244a-02f9-00000000007e 12372 1727204080.30213: variable 'ansible_search_path' from source: unknown 12372 1727204080.30216: variable 'ansible_search_path' from source: unknown 12372 1727204080.30255: calling self._execute() 12372 1727204080.30334: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204080.30341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204080.30353: variable 'omit' from source: magic vars 12372 1727204080.30810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204080.33496: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204080.33699: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204080.33703: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204080.33706: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204080.33738: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204080.33840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204080.33925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204080.33977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204080.34039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204080.34074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204080.34244: variable 'ansible_distribution' from source: facts 12372 1727204080.34258: variable 'ansible_distribution_major_version' from source: facts 12372 1727204080.34276: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204080.34300: when evaluation is False, skipping this task 12372 1727204080.34405: _execute() done 12372 1727204080.34408: dumping result to json 12372 1727204080.34411: done dumping result, returning 12372 1727204080.34414: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-244a-02f9-00000000007e] 12372 1727204080.34416: sending task result for task 12b410aa-8751-244a-02f9-00000000007e 12372 1727204080.34493: 
done sending task result for task 12b410aa-8751-244a-02f9-00000000007e 12372 1727204080.34498: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204080.34550: no more pending results, returning what we have 12372 1727204080.34555: results queue empty 12372 1727204080.34556: checking for any_errors_fatal 12372 1727204080.34562: done checking for any_errors_fatal 12372 1727204080.34563: checking for max_fail_percentage 12372 1727204080.34566: done checking for max_fail_percentage 12372 1727204080.34567: checking to see if all hosts have failed and the running result is not ok 12372 1727204080.34568: done checking to see if all hosts have failed 12372 1727204080.34569: getting the remaining hosts for this loop 12372 1727204080.34570: done getting the remaining hosts for this loop 12372 1727204080.34575: getting the next task for host managed-node3 12372 1727204080.34583: done getting next task for host managed-node3 12372 1727204080.34588: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12372 1727204080.34594: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204080.34615: getting variables 12372 1727204080.34617: in VariableManager get_vars() 12372 1727204080.34678: Calling all_inventory to load vars for managed-node3 12372 1727204080.34682: Calling groups_inventory to load vars for managed-node3 12372 1727204080.34685: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204080.34814: Calling all_plugins_play to load vars for managed-node3 12372 1727204080.34818: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204080.34823: Calling groups_plugins_play to load vars for managed-node3 12372 1727204080.35277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204080.35599: done with get_vars() 12372 1727204080.35613: done getting variables 12372 1727204080.35688: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.060) 0:00:07.342 ***** 12372 1727204080.35728: entering _queue_task() for managed-node3/fail 12372 1727204080.36117: worker is 1 (out of 1 available) 12372 1727204080.36130: exiting _queue_task() for managed-node3/fail 12372 1727204080.36141: done queuing things up, now waiting for results queue to drain 12372 1727204080.36143: waiting for pending results... 
12372 1727204080.36573: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12372 1727204080.36578: in run() - task 12b410aa-8751-244a-02f9-00000000007f 12372 1727204080.36581: variable 'ansible_search_path' from source: unknown 12372 1727204080.36584: variable 'ansible_search_path' from source: unknown 12372 1727204080.36612: calling self._execute() 12372 1727204080.36721: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204080.36736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204080.36755: variable 'omit' from source: magic vars 12372 1727204080.37327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204080.40251: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204080.40353: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204080.40414: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204080.40490: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204080.40511: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204080.40632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204080.40674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204080.40794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204080.40798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204080.40813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204080.40996: variable 'ansible_distribution' from source: facts 12372 1727204080.41010: variable 'ansible_distribution_major_version' from source: facts 12372 1727204080.41028: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204080.41049: when evaluation is False, skipping this task 12372 1727204080.41061: _execute() done 12372 1727204080.41070: dumping result to json 12372 1727204080.41079: done dumping result, returning 12372 1727204080.41149: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-244a-02f9-00000000007f] 12372 1727204080.41155: sending task result for task 12b410aa-8751-244a-02f9-00000000007f 12372 1727204080.41236: done 
sending task result for task 12b410aa-8751-244a-02f9-00000000007f 12372 1727204080.41240: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204080.41306: no more pending results, returning what we have 12372 1727204080.41310: results queue empty 12372 1727204080.41311: checking for any_errors_fatal 12372 1727204080.41318: done checking for any_errors_fatal 12372 1727204080.41319: checking for max_fail_percentage 12372 1727204080.41322: done checking for max_fail_percentage 12372 1727204080.41323: checking to see if all hosts have failed and the running result is not ok 12372 1727204080.41324: done checking to see if all hosts have failed 12372 1727204080.41325: getting the remaining hosts for this loop 12372 1727204080.41327: done getting the remaining hosts for this loop 12372 1727204080.41331: getting the next task for host managed-node3 12372 1727204080.41340: done getting next task for host managed-node3 12372 1727204080.41344: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12372 1727204080.41348: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204080.41370: getting variables 12372 1727204080.41373: in VariableManager get_vars() 12372 1727204080.41435: Calling all_inventory to load vars for managed-node3 12372 1727204080.41439: Calling groups_inventory to load vars for managed-node3 12372 1727204080.41442: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204080.41454: Calling all_plugins_play to load vars for managed-node3 12372 1727204080.41458: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204080.41462: Calling groups_plugins_play to load vars for managed-node3 12372 1727204080.41988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204080.42324: done with get_vars() 12372 1727204080.42338: done getting variables 12372 1727204080.42414: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.067) 0:00:07.410 ***** 12372 1727204080.42448: entering _queue_task() for managed-node3/dnf 12372 1727204080.42826: worker is 1 (out of 1 available) 12372 1727204080.42838: exiting _queue_task() for managed-node3/dnf 12372 1727204080.42850: done queuing things up, now waiting for results queue to drain 12372 1727204080.42851: waiting for pending results... 
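The DNF task queued above, "Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces" (tasks/main.yml:36), loads the dnf action plugin. A check of that kind is typically a check-mode package call whose registered result only reports whether anything would change; the sketch below shows the pattern with placeholder package names and a placeholder register variable, neither of which is taken from the role.

    # Illustrative "are updates available?" probe: run dnf in check mode and
    # register the result instead of changing anything. Package names and the
    # register variable are hypothetical placeholders.
    - name: Check whether updates for wireless/team packages are available
      ansible.builtin.dnf:
        name:
          - NetworkManager-wifi
          - NetworkManager-team
        state: latest
      check_mode: true
      register: __network_updates_check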
12372 1727204080.43039: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12372 1727204080.43137: in run() - task 12b410aa-8751-244a-02f9-000000000080 12372 1727204080.43148: variable 'ansible_search_path' from source: unknown 12372 1727204080.43152: variable 'ansible_search_path' from source: unknown 12372 1727204080.43189: calling self._execute() 12372 1727204080.43269: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204080.43275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204080.43286: variable 'omit' from source: magic vars 12372 1727204080.43698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204080.45694: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204080.45698: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204080.45700: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204080.45702: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204080.45728: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204080.45818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204080.45856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204080.45892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204080.45945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204080.45966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204080.46121: variable 'ansible_distribution' from source: facts 12372 1727204080.46125: variable 'ansible_distribution_major_version' from source: facts 12372 1727204080.46135: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204080.46146: when evaluation is False, skipping this task 12372 1727204080.46153: _execute() done 12372 1727204080.46156: dumping result to json 12372 1727204080.46159: done dumping result, returning 12372 1727204080.46162: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-244a-02f9-000000000080] 12372 1727204080.46164: sending task result for task 
12b410aa-8751-244a-02f9-000000000080 skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204080.46318: no more pending results, returning what we have 12372 1727204080.46321: results queue empty 12372 1727204080.46322: checking for any_errors_fatal 12372 1727204080.46329: done checking for any_errors_fatal 12372 1727204080.46330: checking for max_fail_percentage 12372 1727204080.46332: done checking for max_fail_percentage 12372 1727204080.46333: checking to see if all hosts have failed and the running result is not ok 12372 1727204080.46334: done checking to see if all hosts have failed 12372 1727204080.46335: getting the remaining hosts for this loop 12372 1727204080.46337: done getting the remaining hosts for this loop 12372 1727204080.46340: getting the next task for host managed-node3 12372 1727204080.46346: done getting next task for host managed-node3 12372 1727204080.46351: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12372 1727204080.46359: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204080.46380: getting variables 12372 1727204080.46382: in VariableManager get_vars() 12372 1727204080.46483: Calling all_inventory to load vars for managed-node3 12372 1727204080.46485: Calling groups_inventory to load vars for managed-node3 12372 1727204080.46487: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204080.46494: done sending task result for task 12b410aa-8751-244a-02f9-000000000080 12372 1727204080.46498: WORKER PROCESS EXITING 12372 1727204080.46505: Calling all_plugins_play to load vars for managed-node3 12372 1727204080.46508: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204080.46510: Calling groups_plugins_play to load vars for managed-node3 12372 1727204080.46640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204080.46806: done with get_vars() 12372 1727204080.46814: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12372 1727204080.46872: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.044) 0:00:07.454 ***** 12372 1727204080.46898: entering _queue_task() for managed-node3/yum 12372 1727204080.47096: worker is 1 (out of 1 available) 12372 1727204080.47112: exiting _queue_task() for managed-node3/yum 12372 1727204080.47126: done queuing things up, now waiting for results queue to drain 12372 1727204080.47128: waiting for pending results... 
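Just before the YUM variant of the same check (tasks/main.yml:48), the log records "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf": on this control node the yum action is only an alias that resolves to the dnf implementation. In other words, a task written against ansible.builtin.yum, such as the assumed sketch below, is executed by the same dnf action plugin shown in the surrounding lines.

    # Written with ansible.builtin.yum, but executed by the dnf action plugin
    # after the redirect recorded in the log. The package name is a placeholder.
    - name: Check for NetworkManager updates via the yum alias
      ansible.builtin.yum:
        name: NetworkManager
        state: latest
      check_mode: true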
12372 1727204080.47294: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12372 1727204080.47397: in run() - task 12b410aa-8751-244a-02f9-000000000081 12372 1727204080.47409: variable 'ansible_search_path' from source: unknown 12372 1727204080.47412: variable 'ansible_search_path' from source: unknown 12372 1727204080.47445: calling self._execute() 12372 1727204080.47520: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204080.47524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204080.47535: variable 'omit' from source: magic vars 12372 1727204080.47881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204080.49597: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204080.49652: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204080.49687: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204080.49723: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204080.49745: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204080.49820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204080.49847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204080.49870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204080.49907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204080.49922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204080.50037: variable 'ansible_distribution' from source: facts 12372 1727204080.50042: variable 'ansible_distribution_major_version' from source: facts 12372 1727204080.50053: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204080.50057: when evaluation is False, skipping this task 12372 1727204080.50060: _execute() done 12372 1727204080.50062: dumping result to json 12372 1727204080.50069: done dumping result, returning 12372 1727204080.50078: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-244a-02f9-000000000081] 12372 1727204080.50084: sending task result for task 
12b410aa-8751-244a-02f9-000000000081 12372 1727204080.50177: done sending task result for task 12b410aa-8751-244a-02f9-000000000081 12372 1727204080.50179: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204080.50239: no more pending results, returning what we have 12372 1727204080.50243: results queue empty 12372 1727204080.50244: checking for any_errors_fatal 12372 1727204080.50251: done checking for any_errors_fatal 12372 1727204080.50252: checking for max_fail_percentage 12372 1727204080.50254: done checking for max_fail_percentage 12372 1727204080.50255: checking to see if all hosts have failed and the running result is not ok 12372 1727204080.50256: done checking to see if all hosts have failed 12372 1727204080.50257: getting the remaining hosts for this loop 12372 1727204080.50258: done getting the remaining hosts for this loop 12372 1727204080.50262: getting the next task for host managed-node3 12372 1727204080.50269: done getting next task for host managed-node3 12372 1727204080.50273: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12372 1727204080.50276: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204080.50294: getting variables 12372 1727204080.50296: in VariableManager get_vars() 12372 1727204080.50346: Calling all_inventory to load vars for managed-node3 12372 1727204080.50350: Calling groups_inventory to load vars for managed-node3 12372 1727204080.50353: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204080.50362: Calling all_plugins_play to load vars for managed-node3 12372 1727204080.50365: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204080.50368: Calling groups_plugins_play to load vars for managed-node3 12372 1727204080.50515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204080.50709: done with get_vars() 12372 1727204080.50718: done getting variables 12372 1727204080.50766: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.038) 0:00:07.493 ***** 12372 1727204080.50794: entering _queue_task() for managed-node3/fail 12372 1727204080.51006: worker is 1 (out of 1 available) 12372 1727204080.51022: exiting _queue_task() for managed-node3/fail 12372 1727204080.51035: done queuing things up, now waiting for results queue to drain 12372 1727204080.51037: waiting for pending results... 
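The task queued above, "Ask user's consent to restart NetworkManager due to wireless or team interfaces" (tasks/main.yml:60), is also backed by the fail plugin, which suggests a consent gate: the play aborts unless the operator has explicitly opted in to a NetworkManager restart. The sketch below illustrates that shape; the variable name network_allow_restart is invented for illustration and the condition is an assumption, not the role's code.

    # Hypothetical consent gate: abort unless an opt-in variable was set.
    # "network_allow_restart" is an invented name used only for illustration.
    - name: Ask user's consent to restart NetworkManager
      ansible.builtin.fail:
        msg: >-
          Managing wireless or team interfaces requires restarting
          NetworkManager; set network_allow_restart=true to confirm.
      when: not (network_allow_restart | default(false) | bool)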
12372 1727204080.51211: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12372 1727204080.51312: in run() - task 12b410aa-8751-244a-02f9-000000000082 12372 1727204080.51329: variable 'ansible_search_path' from source: unknown 12372 1727204080.51332: variable 'ansible_search_path' from source: unknown 12372 1727204080.51362: calling self._execute() 12372 1727204080.51438: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204080.51445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204080.51455: variable 'omit' from source: magic vars 12372 1727204080.51996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204080.54193: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204080.54256: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204080.54290: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204080.54325: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204080.54349: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204080.54424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204080.54449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204080.54470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204080.54510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204080.54523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204080.54639: variable 'ansible_distribution' from source: facts 12372 1727204080.54645: variable 'ansible_distribution_major_version' from source: facts 12372 1727204080.54657: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204080.54660: when evaluation is False, skipping this task 12372 1727204080.54662: _execute() done 12372 1727204080.54666: dumping result to json 12372 1727204080.54671: done dumping result, returning 12372 1727204080.54680: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-244a-02f9-000000000082] 12372 1727204080.54685: sending task result for task 12b410aa-8751-244a-02f9-000000000082 12372 1727204080.54784: done sending task result for task 
12b410aa-8751-244a-02f9-000000000082 12372 1727204080.54787: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204080.54846: no more pending results, returning what we have 12372 1727204080.54850: results queue empty 12372 1727204080.54851: checking for any_errors_fatal 12372 1727204080.54859: done checking for any_errors_fatal 12372 1727204080.54860: checking for max_fail_percentage 12372 1727204080.54861: done checking for max_fail_percentage 12372 1727204080.54862: checking to see if all hosts have failed and the running result is not ok 12372 1727204080.54863: done checking to see if all hosts have failed 12372 1727204080.54864: getting the remaining hosts for this loop 12372 1727204080.54866: done getting the remaining hosts for this loop 12372 1727204080.54870: getting the next task for host managed-node3 12372 1727204080.54877: done getting next task for host managed-node3 12372 1727204080.54882: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 12372 1727204080.54885: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204080.54906: getting variables 12372 1727204080.54908: in VariableManager get_vars() 12372 1727204080.54959: Calling all_inventory to load vars for managed-node3 12372 1727204080.54962: Calling groups_inventory to load vars for managed-node3 12372 1727204080.54964: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204080.54973: Calling all_plugins_play to load vars for managed-node3 12372 1727204080.54976: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204080.54979: Calling groups_plugins_play to load vars for managed-node3 12372 1727204080.55138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204080.55316: done with get_vars() 12372 1727204080.55328: done getting variables 12372 1727204080.55375: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.046) 0:00:07.539 ***** 12372 1727204080.55403: entering _queue_task() for managed-node3/package 12372 1727204080.55621: worker is 1 (out of 1 available) 12372 1727204080.55636: exiting _queue_task() for managed-node3/package 12372 1727204080.55648: done queuing things up, now waiting for results queue to drain 12372 1727204080.55650: waiting for pending results... 
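"Install packages" (tasks/main.yml:73) loads the generic package action plugin, which dispatches to whatever package manager the managed host uses. A tasks-file sketch of that pattern follows; the list variable network_packages is a guess standing in for whatever package list the role computes.

    # Generic package installation; "network_packages" is an assumed variable
    # name, not read from the role's source.
    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present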
12372 1727204080.55832: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 12372 1727204080.55937: in run() - task 12b410aa-8751-244a-02f9-000000000083 12372 1727204080.55949: variable 'ansible_search_path' from source: unknown 12372 1727204080.55952: variable 'ansible_search_path' from source: unknown 12372 1727204080.55985: calling self._execute() 12372 1727204080.56056: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204080.56063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204080.56072: variable 'omit' from source: magic vars 12372 1727204080.56440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204080.58200: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204080.58260: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204080.58293: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204080.58329: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204080.58352: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204080.58424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204080.58448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204080.58469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204080.58503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204080.58523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204080.58633: variable 'ansible_distribution' from source: facts 12372 1727204080.58637: variable 'ansible_distribution_major_version' from source: facts 12372 1727204080.58645: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204080.58648: when evaluation is False, skipping this task 12372 1727204080.58650: _execute() done 12372 1727204080.58656: dumping result to json 12372 1727204080.58660: done dumping result, returning 12372 1727204080.58669: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-244a-02f9-000000000083] 12372 1727204080.58674: sending task result for task 12b410aa-8751-244a-02f9-000000000083 12372 1727204080.58767: done sending task result for task 12b410aa-8751-244a-02f9-000000000083 12372 1727204080.58770: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204080.58829: no more pending results, returning what we have 12372 1727204080.58833: results queue empty 12372 1727204080.58834: checking for any_errors_fatal 12372 1727204080.58841: done checking for any_errors_fatal 12372 1727204080.58842: checking for max_fail_percentage 12372 1727204080.58843: done checking for max_fail_percentage 12372 1727204080.58845: checking to see if all hosts have failed and the running result is not ok 12372 1727204080.58846: done checking to see if all hosts have failed 12372 1727204080.58847: getting the remaining hosts for this loop 12372 1727204080.58848: done getting the remaining hosts for this loop 12372 1727204080.58852: getting the next task for host managed-node3 12372 1727204080.58858: done getting next task for host managed-node3 12372 1727204080.58862: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12372 1727204080.58865: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204080.58883: getting variables 12372 1727204080.58885: in VariableManager get_vars() 12372 1727204080.58934: Calling all_inventory to load vars for managed-node3 12372 1727204080.58937: Calling groups_inventory to load vars for managed-node3 12372 1727204080.58940: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204080.58949: Calling all_plugins_play to load vars for managed-node3 12372 1727204080.58952: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204080.58959: Calling groups_plugins_play to load vars for managed-node3 12372 1727204080.59131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204080.59299: done with get_vars() 12372 1727204080.59307: done getting variables 12372 1727204080.59351: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.039) 0:00:07.579 ***** 12372 1727204080.59376: entering _queue_task() for managed-node3/package 12372 1727204080.59576: worker is 1 (out of 1 available) 12372 1727204080.59594: exiting _queue_task() for managed-node3/package 12372 1727204080.59607: done queuing things up, now waiting for results queue to drain 12372 1727204080.59609: waiting for pending results... 
12372 1727204080.59776: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12372 1727204080.59883: in run() - task 12b410aa-8751-244a-02f9-000000000084 12372 1727204080.59900: variable 'ansible_search_path' from source: unknown 12372 1727204080.59904: variable 'ansible_search_path' from source: unknown 12372 1727204080.59938: calling self._execute() 12372 1727204080.60010: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204080.60019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204080.60027: variable 'omit' from source: magic vars 12372 1727204080.60388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204080.62088: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204080.62152: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204080.62182: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204080.62213: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204080.62239: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204080.62308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204080.62333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204080.62355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204080.62396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204080.62410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204080.62524: variable 'ansible_distribution' from source: facts 12372 1727204080.62531: variable 'ansible_distribution_major_version' from source: facts 12372 1727204080.62541: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204080.62544: when evaluation is False, skipping this task 12372 1727204080.62547: _execute() done 12372 1727204080.62551: dumping result to json 12372 1727204080.62556: done dumping result, returning 12372 1727204080.62563: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-244a-02f9-000000000084] 12372 1727204080.62570: sending task result for task 12b410aa-8751-244a-02f9-000000000084 12372 1727204080.62665: done sending task result for task 
12b410aa-8751-244a-02f9-000000000084 12372 1727204080.62668: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204080.62738: no more pending results, returning what we have 12372 1727204080.62741: results queue empty 12372 1727204080.62742: checking for any_errors_fatal 12372 1727204080.62747: done checking for any_errors_fatal 12372 1727204080.62749: checking for max_fail_percentage 12372 1727204080.62750: done checking for max_fail_percentage 12372 1727204080.62751: checking to see if all hosts have failed and the running result is not ok 12372 1727204080.62753: done checking to see if all hosts have failed 12372 1727204080.62753: getting the remaining hosts for this loop 12372 1727204080.62755: done getting the remaining hosts for this loop 12372 1727204080.62758: getting the next task for host managed-node3 12372 1727204080.62764: done getting next task for host managed-node3 12372 1727204080.62767: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12372 1727204080.62771: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204080.62786: getting variables 12372 1727204080.62788: in VariableManager get_vars() 12372 1727204080.62837: Calling all_inventory to load vars for managed-node3 12372 1727204080.62840: Calling groups_inventory to load vars for managed-node3 12372 1727204080.62843: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204080.62851: Calling all_plugins_play to load vars for managed-node3 12372 1727204080.62853: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204080.62855: Calling groups_plugins_play to load vars for managed-node3 12372 1727204080.62997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204080.63167: done with get_vars() 12372 1727204080.63177: done getting variables 12372 1727204080.63226: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.038) 0:00:07.618 ***** 12372 1727204080.63252: entering _queue_task() for managed-node3/package 12372 1727204080.63447: worker is 1 (out of 1 available) 12372 1727204080.63461: exiting _queue_task() for managed-node3/package 12372 1727204080.63474: done queuing things up, now waiting for results queue to drain 12372 1727204080.63476: waiting for pending results... 
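Every task in this block is skipped for the same reason: the managed node is not a CentOS/RedHat system older than version 9, so the shared conditional evaluates to False. To see how that expression resolves on a given host, a throwaway debug task like the one below (not part of the role) prints the same boolean the executor logs; it only needs gathered facts.

- name: Evaluate the skip conditional by hand (illustrative only)
  ansible.builtin.debug:
    # same expression as the false_condition reported in the skip results above
    msg: "{{ ansible_distribution in ['CentOS', 'RedHat'] and ansible_distribution_major_version | int < 9 }}"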
12372 1727204080.63655: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12372 1727204080.63753: in run() - task 12b410aa-8751-244a-02f9-000000000085 12372 1727204080.63764: variable 'ansible_search_path' from source: unknown 12372 1727204080.63768: variable 'ansible_search_path' from source: unknown 12372 1727204080.63801: calling self._execute() 12372 1727204080.63871: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204080.63884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204080.63941: variable 'omit' from source: magic vars 12372 1727204080.64260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204080.65991: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204080.66046: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204080.66077: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204080.66109: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204080.66137: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204080.66202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204080.66229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204080.66254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204080.66285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204080.66300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204080.66413: variable 'ansible_distribution' from source: facts 12372 1727204080.66419: variable 'ansible_distribution_major_version' from source: facts 12372 1727204080.66429: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204080.66432: when evaluation is False, skipping this task 12372 1727204080.66435: _execute() done 12372 1727204080.66439: dumping result to json 12372 1727204080.66444: done dumping result, returning 12372 1727204080.66456: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-244a-02f9-000000000085] 12372 1727204080.66460: sending task result for task 12b410aa-8751-244a-02f9-000000000085 12372 1727204080.66554: done sending task result for task 12b410aa-8751-244a-02f9-000000000085 12372 
1727204080.66559: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204080.66611: no more pending results, returning what we have 12372 1727204080.66614: results queue empty 12372 1727204080.66615: checking for any_errors_fatal 12372 1727204080.66623: done checking for any_errors_fatal 12372 1727204080.66624: checking for max_fail_percentage 12372 1727204080.66625: done checking for max_fail_percentage 12372 1727204080.66627: checking to see if all hosts have failed and the running result is not ok 12372 1727204080.66628: done checking to see if all hosts have failed 12372 1727204080.66628: getting the remaining hosts for this loop 12372 1727204080.66630: done getting the remaining hosts for this loop 12372 1727204080.66634: getting the next task for host managed-node3 12372 1727204080.66640: done getting next task for host managed-node3 12372 1727204080.66645: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12372 1727204080.66648: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204080.66665: getting variables 12372 1727204080.66666: in VariableManager get_vars() 12372 1727204080.66727: Calling all_inventory to load vars for managed-node3 12372 1727204080.66730: Calling groups_inventory to load vars for managed-node3 12372 1727204080.66733: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204080.66742: Calling all_plugins_play to load vars for managed-node3 12372 1727204080.66744: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204080.66747: Calling groups_plugins_play to load vars for managed-node3 12372 1727204080.66927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204080.67095: done with get_vars() 12372 1727204080.67104: done getting variables 12372 1727204080.67153: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.039) 0:00:07.657 ***** 12372 1727204080.67178: entering _queue_task() for managed-node3/service 12372 1727204080.67379: worker is 1 (out of 1 available) 12372 1727204080.67398: exiting _queue_task() for managed-node3/service 12372 1727204080.67410: done queuing things up, now waiting for results queue to drain 12372 1727204080.67412: waiting for pending results... 
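This next task (tasks/main.yml:109) goes through the 'service' action plugin. A minimal sketch of what a restart task of this shape could look like follows; the service name and state are assumptions, and any detection of wireless or team interfaces that the role performs is not visible in this log.

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager     # assumed service name
    state: restarted
  when:
    - ansible_distribution in ['CentOS', 'RedHat']
    - ansible_distribution_major_version | int < 9
    # the role presumably also checks whether wireless or team interfaces are
    # requested; that condition is an assumption and is not shown in the log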
12372 1727204080.67580: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12372 1727204080.67677: in run() - task 12b410aa-8751-244a-02f9-000000000086 12372 1727204080.67691: variable 'ansible_search_path' from source: unknown 12372 1727204080.67696: variable 'ansible_search_path' from source: unknown 12372 1727204080.67728: calling self._execute() 12372 1727204080.67798: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204080.67805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204080.67815: variable 'omit' from source: magic vars 12372 1727204080.68179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204080.70284: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204080.70373: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204080.70424: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204080.70472: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204080.70511: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204080.70620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204080.70664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204080.70708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204080.70766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204080.70895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204080.70958: variable 'ansible_distribution' from source: facts 12372 1727204080.70971: variable 'ansible_distribution_major_version' from source: facts 12372 1727204080.70988: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204080.71047: when evaluation is False, skipping this task 12372 1727204080.71056: _execute() done 12372 1727204080.71064: dumping result to json 12372 1727204080.71073: done dumping result, returning 12372 1727204080.71087: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-244a-02f9-000000000086] 12372 1727204080.71102: sending task result for task 12b410aa-8751-244a-02f9-000000000086 skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204080.71388: no more pending results, returning what we have 12372 1727204080.71395: results queue empty 12372 1727204080.71396: checking for any_errors_fatal 12372 1727204080.71404: done checking for any_errors_fatal 12372 1727204080.71404: checking for max_fail_percentage 12372 1727204080.71406: done checking for max_fail_percentage 12372 1727204080.71407: checking to see if all hosts have failed and the running result is not ok 12372 1727204080.71408: done checking to see if all hosts have failed 12372 1727204080.71410: getting the remaining hosts for this loop 12372 1727204080.71411: done getting the remaining hosts for this loop 12372 1727204080.71419: getting the next task for host managed-node3 12372 1727204080.71427: done getting next task for host managed-node3 12372 1727204080.71432: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12372 1727204080.71436: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204080.71457: getting variables 12372 1727204080.71460: in VariableManager get_vars() 12372 1727204080.71640: Calling all_inventory to load vars for managed-node3 12372 1727204080.71645: Calling groups_inventory to load vars for managed-node3 12372 1727204080.71648: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204080.71730: Calling all_plugins_play to load vars for managed-node3 12372 1727204080.71734: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204080.71739: Calling groups_plugins_play to load vars for managed-node3 12372 1727204080.72029: done sending task result for task 12b410aa-8751-244a-02f9-000000000086 12372 1727204080.72034: WORKER PROCESS EXITING 12372 1727204080.72070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204080.72410: done with get_vars() 12372 1727204080.72426: done getting variables 12372 1727204080.72509: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.053) 0:00:07.711 ***** 12372 1727204080.72550: entering _queue_task() for managed-node3/service 12372 1727204080.72965: worker is 1 (out of 1 available) 12372 1727204080.72978: exiting _queue_task() for managed-node3/service 12372 1727204080.72992: done queuing things up, now waiting for results queue to drain 12372 1727204080.72994: waiting for pending results... 
12372 1727204080.73226: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12372 1727204080.73405: in run() - task 12b410aa-8751-244a-02f9-000000000087 12372 1727204080.73431: variable 'ansible_search_path' from source: unknown 12372 1727204080.73450: variable 'ansible_search_path' from source: unknown 12372 1727204080.73501: calling self._execute() 12372 1727204080.73597: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204080.73605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204080.73620: variable 'omit' from source: magic vars 12372 1727204080.74062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204080.75933: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204080.75990: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204080.76024: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204080.76080: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204080.76298: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204080.76302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204080.76305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204080.76307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204080.76332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204080.76352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204080.76519: variable 'ansible_distribution' from source: facts 12372 1727204080.76533: variable 'ansible_distribution_major_version' from source: facts 12372 1727204080.76548: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204080.76555: when evaluation is False, skipping this task 12372 1727204080.76561: _execute() done 12372 1727204080.76567: dumping result to json 12372 1727204080.76574: done dumping result, returning 12372 1727204080.76585: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-244a-02f9-000000000087] 12372 1727204080.76596: sending task result for task 12b410aa-8751-244a-02f9-000000000087 skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 
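The result just above for "Enable and start NetworkManager" is censored because the task sets no_log: true, which hides the task output (including the false_condition) even for a skipped task. A minimal sketch of a service task that would behave this way, with the service name and flags assumed rather than taken from the role source:

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: NetworkManager   # assumed
    state: started
    enabled: true
  no_log: true             # confirmed by the censored result in the log
  when:
    - ansible_distribution in ['CentOS', 'RedHat']
    - ansible_distribution_major_version | int < 9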
12372 1727204080.76748: no more pending results, returning what we have 12372 1727204080.76752: results queue empty 12372 1727204080.76753: checking for any_errors_fatal 12372 1727204080.76760: done checking for any_errors_fatal 12372 1727204080.76761: checking for max_fail_percentage 12372 1727204080.76762: done checking for max_fail_percentage 12372 1727204080.76763: checking to see if all hosts have failed and the running result is not ok 12372 1727204080.76764: done checking to see if all hosts have failed 12372 1727204080.76765: getting the remaining hosts for this loop 12372 1727204080.76767: done getting the remaining hosts for this loop 12372 1727204080.76771: getting the next task for host managed-node3 12372 1727204080.76778: done getting next task for host managed-node3 12372 1727204080.76782: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12372 1727204080.76786: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204080.76804: getting variables 12372 1727204080.76806: in VariableManager get_vars() 12372 1727204080.76868: Calling all_inventory to load vars for managed-node3 12372 1727204080.76872: Calling groups_inventory to load vars for managed-node3 12372 1727204080.76874: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204080.76884: Calling all_plugins_play to load vars for managed-node3 12372 1727204080.76887: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204080.76896: done sending task result for task 12b410aa-8751-244a-02f9-000000000087 12372 1727204080.76899: WORKER PROCESS EXITING 12372 1727204080.76903: Calling groups_plugins_play to load vars for managed-node3 12372 1727204080.77113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204080.77284: done with get_vars() 12372 1727204080.77295: done getting variables 12372 1727204080.77342: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.048) 0:00:07.759 ***** 12372 1727204080.77367: entering _queue_task() for managed-node3/service 12372 1727204080.77576: worker is 1 (out of 1 available) 12372 1727204080.77594: exiting _queue_task() for managed-node3/service 12372 1727204080.77606: done queuing things up, now waiting for results queue to drain 12372 1727204080.77608: waiting for pending results... 
12372 1727204080.77781: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12372 1727204080.77879: in run() - task 12b410aa-8751-244a-02f9-000000000088 12372 1727204080.77892: variable 'ansible_search_path' from source: unknown 12372 1727204080.77897: variable 'ansible_search_path' from source: unknown 12372 1727204080.77930: calling self._execute() 12372 1727204080.78000: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204080.78006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204080.78020: variable 'omit' from source: magic vars 12372 1727204080.78375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204080.80079: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204080.80136: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204080.80167: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204080.80198: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204080.80225: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204080.80291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204080.80315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204080.80340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204080.80377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204080.80391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204080.80503: variable 'ansible_distribution' from source: facts 12372 1727204080.80509: variable 'ansible_distribution_major_version' from source: facts 12372 1727204080.80521: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204080.80524: when evaluation is False, skipping this task 12372 1727204080.80527: _execute() done 12372 1727204080.80530: dumping result to json 12372 1727204080.80533: done dumping result, returning 12372 1727204080.80542: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-244a-02f9-000000000088] 12372 1727204080.80549: sending task result for task 12b410aa-8751-244a-02f9-000000000088 12372 1727204080.80644: done sending task result for task 12b410aa-8751-244a-02f9-000000000088 12372 1727204080.80647: WORKER PROCESS EXITING skipping: 
[managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204080.80703: no more pending results, returning what we have 12372 1727204080.80707: results queue empty 12372 1727204080.80708: checking for any_errors_fatal 12372 1727204080.80714: done checking for any_errors_fatal 12372 1727204080.80715: checking for max_fail_percentage 12372 1727204080.80719: done checking for max_fail_percentage 12372 1727204080.80720: checking to see if all hosts have failed and the running result is not ok 12372 1727204080.80721: done checking to see if all hosts have failed 12372 1727204080.80722: getting the remaining hosts for this loop 12372 1727204080.80723: done getting the remaining hosts for this loop 12372 1727204080.80728: getting the next task for host managed-node3 12372 1727204080.80734: done getting next task for host managed-node3 12372 1727204080.80739: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12372 1727204080.80742: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204080.80758: getting variables 12372 1727204080.80760: in VariableManager get_vars() 12372 1727204080.80811: Calling all_inventory to load vars for managed-node3 12372 1727204080.80814: Calling groups_inventory to load vars for managed-node3 12372 1727204080.80819: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204080.80829: Calling all_plugins_play to load vars for managed-node3 12372 1727204080.80836: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204080.80839: Calling groups_plugins_play to load vars for managed-node3 12372 1727204080.80979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204080.81164: done with get_vars() 12372 1727204080.81173: done getting variables 12372 1727204080.81222: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.038) 0:00:07.798 ***** 12372 1727204080.81248: entering _queue_task() for managed-node3/service 12372 1727204080.81466: worker is 1 (out of 1 available) 12372 1727204080.81481: exiting _queue_task() for managed-node3/service 12372 1727204080.81496: done queuing things up, now waiting for results queue to drain 12372 1727204080.81498: waiting for pending results... 
12372 1727204080.81667: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 12372 1727204080.81768: in run() - task 12b410aa-8751-244a-02f9-000000000089 12372 1727204080.81780: variable 'ansible_search_path' from source: unknown 12372 1727204080.81784: variable 'ansible_search_path' from source: unknown 12372 1727204080.81828: calling self._execute() 12372 1727204080.81904: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204080.81911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204080.81924: variable 'omit' from source: magic vars 12372 1727204080.82351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204080.84586: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204080.84994: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204080.84997: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204080.84999: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204080.85030: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204080.85125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204080.85150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204080.85177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204080.85214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204080.85229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204080.85341: variable 'ansible_distribution' from source: facts 12372 1727204080.85347: variable 'ansible_distribution_major_version' from source: facts 12372 1727204080.85357: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204080.85360: when evaluation is False, skipping this task 12372 1727204080.85362: _execute() done 12372 1727204080.85367: dumping result to json 12372 1727204080.85374: done dumping result, returning 12372 1727204080.85384: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-244a-02f9-000000000089] 12372 1727204080.85387: sending task result for task 12b410aa-8751-244a-02f9-000000000089 12372 1727204080.85490: done sending task result for task 12b410aa-8751-244a-02f9-000000000089 12372 1727204080.85494: WORKER PROCESS EXITING skipping: [managed-node3] => { 
"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12372 1727204080.85546: no more pending results, returning what we have 12372 1727204080.85551: results queue empty 12372 1727204080.85552: checking for any_errors_fatal 12372 1727204080.85559: done checking for any_errors_fatal 12372 1727204080.85560: checking for max_fail_percentage 12372 1727204080.85562: done checking for max_fail_percentage 12372 1727204080.85563: checking to see if all hosts have failed and the running result is not ok 12372 1727204080.85564: done checking to see if all hosts have failed 12372 1727204080.85565: getting the remaining hosts for this loop 12372 1727204080.85566: done getting the remaining hosts for this loop 12372 1727204080.85570: getting the next task for host managed-node3 12372 1727204080.85577: done getting next task for host managed-node3 12372 1727204080.85581: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12372 1727204080.85584: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204080.85602: getting variables 12372 1727204080.85605: in VariableManager get_vars() 12372 1727204080.85658: Calling all_inventory to load vars for managed-node3 12372 1727204080.85661: Calling groups_inventory to load vars for managed-node3 12372 1727204080.85663: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204080.85673: Calling all_plugins_play to load vars for managed-node3 12372 1727204080.85676: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204080.85680: Calling groups_plugins_play to load vars for managed-node3 12372 1727204080.85871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204080.86041: done with get_vars() 12372 1727204080.86050: done getting variables 12372 1727204080.86100: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.048) 0:00:07.846 ***** 12372 1727204080.86131: entering _queue_task() for managed-node3/copy 12372 1727204080.86342: worker is 1 (out of 1 available) 12372 1727204080.86358: exiting _queue_task() for managed-node3/copy 12372 1727204080.86370: done queuing things up, now waiting for results queue to drain 12372 1727204080.86372: waiting for pending results... 
12372 1727204080.86541: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12372 1727204080.86639: in run() - task 12b410aa-8751-244a-02f9-00000000008a 12372 1727204080.86651: variable 'ansible_search_path' from source: unknown 12372 1727204080.86655: variable 'ansible_search_path' from source: unknown 12372 1727204080.86686: calling self._execute() 12372 1727204080.86757: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204080.86763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204080.86773: variable 'omit' from source: magic vars 12372 1727204080.87127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204080.89032: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204080.89082: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204080.89123: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204080.89148: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204080.89171: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204080.89241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204080.89265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204080.89286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204080.89325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204080.89338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204080.89445: variable 'ansible_distribution' from source: facts 12372 1727204080.89455: variable 'ansible_distribution_major_version' from source: facts 12372 1727204080.89466: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204080.89469: when evaluation is False, skipping this task 12372 1727204080.89472: _execute() done 12372 1727204080.89474: dumping result to json 12372 1727204080.89480: done dumping result, returning 12372 1727204080.89488: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-244a-02f9-00000000008a] 12372 1727204080.89496: sending task result for task 12b410aa-8751-244a-02f9-00000000008a 12372 1727204080.89586: done sending task result for task 12b410aa-8751-244a-02f9-00000000008a 12372 1727204080.89592: 
WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204080.89641: no more pending results, returning what we have 12372 1727204080.89645: results queue empty 12372 1727204080.89646: checking for any_errors_fatal 12372 1727204080.89651: done checking for any_errors_fatal 12372 1727204080.89652: checking for max_fail_percentage 12372 1727204080.89654: done checking for max_fail_percentage 12372 1727204080.89655: checking to see if all hosts have failed and the running result is not ok 12372 1727204080.89656: done checking to see if all hosts have failed 12372 1727204080.89657: getting the remaining hosts for this loop 12372 1727204080.89658: done getting the remaining hosts for this loop 12372 1727204080.89663: getting the next task for host managed-node3 12372 1727204080.89669: done getting next task for host managed-node3 12372 1727204080.89673: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12372 1727204080.89676: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204080.89695: getting variables 12372 1727204080.89697: in VariableManager get_vars() 12372 1727204080.89747: Calling all_inventory to load vars for managed-node3 12372 1727204080.89751: Calling groups_inventory to load vars for managed-node3 12372 1727204080.89753: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204080.89762: Calling all_plugins_play to load vars for managed-node3 12372 1727204080.89765: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204080.89768: Calling groups_plugins_play to load vars for managed-node3 12372 1727204080.89929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204080.90115: done with get_vars() 12372 1727204080.90124: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.040) 0:00:07.887 ***** 12372 1727204080.90191: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 12372 1727204080.90393: worker is 1 (out of 1 available) 12372 1727204080.90409: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 12372 1727204080.90421: done queuing things up, now waiting for results queue to drain 12372 1727204080.90423: waiting for pending results... 
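For "Configure networking connection profiles" (tasks/main.yml:159) the executor queues the collection's own fedora.linux_system_roles.network_connections action plugin rather than a builtin module. The sketch below only illustrates the call shape; the argument names and variables are assumptions and the real task may pass additional options.

- name: Configure networking connection profiles
  fedora.linux_system_roles.network_connections:
    provider: "{{ network_provider }}"        # assumed argument and variable names,
    connections: "{{ network_connections }}"  # shown only to illustrate the call shape
  when:
    - ansible_distribution in ['CentOS', 'RedHat']
    - ansible_distribution_major_version | int < 9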
12372 1727204080.90598: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12372 1727204080.90691: in run() - task 12b410aa-8751-244a-02f9-00000000008b 12372 1727204080.90704: variable 'ansible_search_path' from source: unknown 12372 1727204080.90707: variable 'ansible_search_path' from source: unknown 12372 1727204080.90740: calling self._execute() 12372 1727204080.90811: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204080.90817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204080.90829: variable 'omit' from source: magic vars 12372 1727204080.91190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204080.93123: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204080.93176: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204080.93208: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204080.93251: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204080.93277: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204080.93345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204080.93369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204080.93395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204080.93431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204080.93444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204080.93550: variable 'ansible_distribution' from source: facts 12372 1727204080.93556: variable 'ansible_distribution_major_version' from source: facts 12372 1727204080.93566: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204080.93569: when evaluation is False, skipping this task 12372 1727204080.93572: _execute() done 12372 1727204080.93575: dumping result to json 12372 1727204080.93580: done dumping result, returning 12372 1727204080.93588: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-244a-02f9-00000000008b] 12372 1727204080.93596: sending task result for task 12b410aa-8751-244a-02f9-00000000008b 12372 1727204080.93698: done sending task result for task 12b410aa-8751-244a-02f9-00000000008b 12372 1727204080.93701: WORKER PROCESS EXITING 
skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204080.93758: no more pending results, returning what we have 12372 1727204080.93761: results queue empty 12372 1727204080.93762: checking for any_errors_fatal 12372 1727204080.93768: done checking for any_errors_fatal 12372 1727204080.93769: checking for max_fail_percentage 12372 1727204080.93771: done checking for max_fail_percentage 12372 1727204080.93772: checking to see if all hosts have failed and the running result is not ok 12372 1727204080.93773: done checking to see if all hosts have failed 12372 1727204080.93774: getting the remaining hosts for this loop 12372 1727204080.93775: done getting the remaining hosts for this loop 12372 1727204080.93778: getting the next task for host managed-node3 12372 1727204080.93784: done getting next task for host managed-node3 12372 1727204080.93788: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12372 1727204080.93810: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204080.93829: getting variables 12372 1727204080.93831: in VariableManager get_vars() 12372 1727204080.93874: Calling all_inventory to load vars for managed-node3 12372 1727204080.93876: Calling groups_inventory to load vars for managed-node3 12372 1727204080.93879: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204080.93886: Calling all_plugins_play to load vars for managed-node3 12372 1727204080.93888: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204080.93893: Calling groups_plugins_play to load vars for managed-node3 12372 1727204080.94053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204080.94220: done with get_vars() 12372 1727204080.94228: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.041) 0:00:07.928 ***** 12372 1727204080.94294: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 12372 1727204080.94493: worker is 1 (out of 1 available) 12372 1727204080.94507: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 12372 1727204080.94522: done queuing things up, now waiting for results queue to drain 12372 1727204080.94524: waiting for pending results... 
12372 1727204080.94688: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 12372 1727204080.94783: in run() - task 12b410aa-8751-244a-02f9-00000000008c 12372 1727204080.94798: variable 'ansible_search_path' from source: unknown 12372 1727204080.94801: variable 'ansible_search_path' from source: unknown 12372 1727204080.94834: calling self._execute() 12372 1727204080.94899: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204080.94906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204080.94918: variable 'omit' from source: magic vars 12372 1727204080.95259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204080.97144: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204080.97198: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204080.97230: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204080.97261: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204080.97286: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204080.97352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204080.97378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204080.97404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204080.97438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204080.97450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204080.97558: variable 'ansible_distribution' from source: facts 12372 1727204080.97562: variable 'ansible_distribution_major_version' from source: facts 12372 1727204080.97573: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204080.97576: when evaluation is False, skipping this task 12372 1727204080.97579: _execute() done 12372 1727204080.97584: dumping result to json 12372 1727204080.97589: done dumping result, returning 12372 1727204080.97600: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-244a-02f9-00000000008c] 12372 1727204080.97605: sending task result for task 12b410aa-8751-244a-02f9-00000000008c 12372 1727204080.97695: done sending task result for task 12b410aa-8751-244a-02f9-00000000008c 12372 1727204080.97698: WORKER PROCESS EXITING skipping: [managed-node3] => { 
"changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204080.97755: no more pending results, returning what we have 12372 1727204080.97760: results queue empty 12372 1727204080.97761: checking for any_errors_fatal 12372 1727204080.97767: done checking for any_errors_fatal 12372 1727204080.97768: checking for max_fail_percentage 12372 1727204080.97769: done checking for max_fail_percentage 12372 1727204080.97770: checking to see if all hosts have failed and the running result is not ok 12372 1727204080.97772: done checking to see if all hosts have failed 12372 1727204080.97773: getting the remaining hosts for this loop 12372 1727204080.97774: done getting the remaining hosts for this loop 12372 1727204080.97779: getting the next task for host managed-node3 12372 1727204080.97785: done getting next task for host managed-node3 12372 1727204080.97796: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12372 1727204080.97800: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204080.97819: getting variables 12372 1727204080.97821: in VariableManager get_vars() 12372 1727204080.97870: Calling all_inventory to load vars for managed-node3 12372 1727204080.97873: Calling groups_inventory to load vars for managed-node3 12372 1727204080.97876: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204080.97883: Calling all_plugins_play to load vars for managed-node3 12372 1727204080.97885: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204080.97888: Calling groups_plugins_play to load vars for managed-node3 12372 1727204080.98070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204080.98239: done with get_vars() 12372 1727204080.98248: done getting variables 12372 1727204080.98309: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.040) 0:00:07.969 ***** 12372 1727204080.98335: entering _queue_task() for managed-node3/debug 12372 1727204080.98537: worker is 1 (out of 1 available) 12372 1727204080.98552: exiting _queue_task() for managed-node3/debug 12372 1727204080.98564: done queuing things up, now waiting for results queue to drain 12372 1727204080.98566: waiting for pending results... 
12372 1727204080.98752: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12372 1727204080.98850: in run() - task 12b410aa-8751-244a-02f9-00000000008d 12372 1727204080.98863: variable 'ansible_search_path' from source: unknown 12372 1727204080.98867: variable 'ansible_search_path' from source: unknown 12372 1727204080.98902: calling self._execute() 12372 1727204080.98983: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204080.98991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204080.99001: variable 'omit' from source: magic vars 12372 1727204080.99380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204081.01371: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204081.01430: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204081.01466: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204081.01497: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204081.01522: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204081.01610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204081.01633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204081.01659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204081.01692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204081.01706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204081.01814: variable 'ansible_distribution' from source: facts 12372 1727204081.01820: variable 'ansible_distribution_major_version' from source: facts 12372 1727204081.01830: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204081.01833: when evaluation is False, skipping this task 12372 1727204081.01836: _execute() done 12372 1727204081.01839: dumping result to json 12372 1727204081.01844: done dumping result, returning 12372 1727204081.01854: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-244a-02f9-00000000008d] 12372 1727204081.01857: sending task result for task 12b410aa-8751-244a-02f9-00000000008d 12372 1727204081.01947: done sending task result for task 12b410aa-8751-244a-02f9-00000000008d 12372 1727204081.01951: WORKER 
PROCESS EXITING skipping: [managed-node3] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12372 1727204081.02022: no more pending results, returning what we have 12372 1727204081.02026: results queue empty 12372 1727204081.02027: checking for any_errors_fatal 12372 1727204081.02032: done checking for any_errors_fatal 12372 1727204081.02033: checking for max_fail_percentage 12372 1727204081.02035: done checking for max_fail_percentage 12372 1727204081.02036: checking to see if all hosts have failed and the running result is not ok 12372 1727204081.02037: done checking to see if all hosts have failed 12372 1727204081.02038: getting the remaining hosts for this loop 12372 1727204081.02039: done getting the remaining hosts for this loop 12372 1727204081.02043: getting the next task for host managed-node3 12372 1727204081.02049: done getting next task for host managed-node3 12372 1727204081.02054: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12372 1727204081.02056: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204081.02072: getting variables 12372 1727204081.02074: in VariableManager get_vars() 12372 1727204081.02127: Calling all_inventory to load vars for managed-node3 12372 1727204081.02130: Calling groups_inventory to load vars for managed-node3 12372 1727204081.02133: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204081.02142: Calling all_plugins_play to load vars for managed-node3 12372 1727204081.02145: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204081.02149: Calling groups_plugins_play to load vars for managed-node3 12372 1727204081.02301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204081.02480: done with get_vars() 12372 1727204081.02491: done getting variables 12372 1727204081.02537: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.042) 0:00:08.011 ***** 12372 1727204081.02562: entering _queue_task() for managed-node3/debug 12372 1727204081.02766: worker is 1 (out of 1 available) 12372 1727204081.02781: exiting _queue_task() for managed-node3/debug 12372 1727204081.02795: done queuing things up, now waiting for results queue to drain 12372 1727204081.02797: waiting for pending results... 
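
Both variables in that guard are resolved "from source: facts", i.e. they are populated by fact gathering before the role's tasks run. A small hypothetical helper play (not part of this test) that surfaces the two facts the conditional is evaluated against:

---
# Hypothetical helper play: print the two facts the guard reads.
- hosts: managed-node3
  gather_facts: true
  tasks:
    - name: Show the facts the when guard reads
      ansible.builtin.debug:
        msg: >-
          {{ ansible_distribution }} {{ ansible_distribution_major_version }}
          (as int: {{ ansible_distribution_major_version | int }})

On this run the expression is False, which is consistent with a managed node that is either not CentOS/RHEL or is at major version 9 or later, so every task carrying this guard in the block is skipped.
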
12372 1727204081.03106: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12372 1727204081.03184: in run() - task 12b410aa-8751-244a-02f9-00000000008e 12372 1727204081.03210: variable 'ansible_search_path' from source: unknown 12372 1727204081.03228: variable 'ansible_search_path' from source: unknown 12372 1727204081.03277: calling self._execute() 12372 1727204081.03444: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204081.03448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204081.03450: variable 'omit' from source: magic vars 12372 1727204081.03990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204081.07136: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204081.07225: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204081.07279: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204081.07335: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204081.07399: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204081.07497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204081.07548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204081.07596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204081.07661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204081.07694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204081.07874: variable 'ansible_distribution' from source: facts 12372 1727204081.07943: variable 'ansible_distribution_major_version' from source: facts 12372 1727204081.07947: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204081.07950: when evaluation is False, skipping this task 12372 1727204081.07952: _execute() done 12372 1727204081.07955: dumping result to json 12372 1727204081.07957: done dumping result, returning 12372 1727204081.07959: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-244a-02f9-00000000008e] 12372 1727204081.07969: sending task result for task 12b410aa-8751-244a-02f9-00000000008e skipping: [managed-node3] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)" } 12372 1727204081.08268: no more pending results, returning what we have 12372 1727204081.08273: results queue empty 12372 1727204081.08274: checking for any_errors_fatal 12372 1727204081.08280: done checking for any_errors_fatal 12372 1727204081.08281: checking for max_fail_percentage 12372 1727204081.08283: done checking for max_fail_percentage 12372 1727204081.08284: checking to see if all hosts have failed and the running result is not ok 12372 1727204081.08285: done checking to see if all hosts have failed 12372 1727204081.08286: getting the remaining hosts for this loop 12372 1727204081.08288: done getting the remaining hosts for this loop 12372 1727204081.08296: getting the next task for host managed-node3 12372 1727204081.08308: done getting next task for host managed-node3 12372 1727204081.08313: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12372 1727204081.08321: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204081.08340: getting variables 12372 1727204081.08343: in VariableManager get_vars() 12372 1727204081.08518: Calling all_inventory to load vars for managed-node3 12372 1727204081.08522: Calling groups_inventory to load vars for managed-node3 12372 1727204081.08526: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204081.08534: done sending task result for task 12b410aa-8751-244a-02f9-00000000008e 12372 1727204081.08537: WORKER PROCESS EXITING 12372 1727204081.08547: Calling all_plugins_play to load vars for managed-node3 12372 1727204081.08551: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204081.08555: Calling groups_plugins_play to load vars for managed-node3 12372 1727204081.09244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204081.09555: done with get_vars() 12372 1727204081.09567: done getting variables 12372 1727204081.09649: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.071) 0:00:08.082 ***** 12372 1727204081.09674: entering _queue_task() for managed-node3/debug 12372 1727204081.09890: worker is 1 (out of 1 available) 12372 1727204081.09904: exiting _queue_task() for managed-node3/debug 12372 1727204081.09918: done queuing things up, now waiting for results queue to drain 12372 1727204081.09920: waiting for pending results... 
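
The two tasks skipped in this stretch ("Show stderr messages ..." and "Show debug messages for the network_connections", role tasks main.yml:177 and :181) both load the debug action plugin. Their exact bodies are not visible in the log; a plausible task-file style snippet in the same spirit, where the registered variable name __network_connections_result is an assumption rather than something taken from the role, is:

# Hypothetical reporting task; the variable name is illustrative only.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines
  when: >-
    ansible_distribution in ['CentOS', 'RedHat']
    and ansible_distribution_major_version | int < 9
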
12372 1727204081.10121: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12372 1727204081.10291: in run() - task 12b410aa-8751-244a-02f9-00000000008f 12372 1727204081.10304: variable 'ansible_search_path' from source: unknown 12372 1727204081.10308: variable 'ansible_search_path' from source: unknown 12372 1727204081.10344: calling self._execute() 12372 1727204081.10418: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204081.10427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204081.10438: variable 'omit' from source: magic vars 12372 1727204081.10798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204081.13745: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204081.14075: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204081.14080: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204081.14101: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204081.14222: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204081.14446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204081.14495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204081.14650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204081.14804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204081.14843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204081.15044: variable 'ansible_distribution' from source: facts 12372 1727204081.15071: variable 'ansible_distribution_major_version' from source: facts 12372 1727204081.15091: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204081.15101: when evaluation is False, skipping this task 12372 1727204081.15108: _execute() done 12372 1727204081.15116: dumping result to json 12372 1727204081.15124: done dumping result, returning 12372 1727204081.15136: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-244a-02f9-00000000008f] 12372 1727204081.15146: sending task result for task 12b410aa-8751-244a-02f9-00000000008f skipping: [managed-node3] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 
9)" } 12372 1727204081.15439: no more pending results, returning what we have 12372 1727204081.15443: results queue empty 12372 1727204081.15444: checking for any_errors_fatal 12372 1727204081.15453: done checking for any_errors_fatal 12372 1727204081.15454: checking for max_fail_percentage 12372 1727204081.15456: done checking for max_fail_percentage 12372 1727204081.15456: checking to see if all hosts have failed and the running result is not ok 12372 1727204081.15458: done checking to see if all hosts have failed 12372 1727204081.15459: getting the remaining hosts for this loop 12372 1727204081.15461: done getting the remaining hosts for this loop 12372 1727204081.15466: getting the next task for host managed-node3 12372 1727204081.15475: done getting next task for host managed-node3 12372 1727204081.15479: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12372 1727204081.15483: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204081.15513: getting variables 12372 1727204081.15515: in VariableManager get_vars() 12372 1727204081.15583: Calling all_inventory to load vars for managed-node3 12372 1727204081.15587: Calling groups_inventory to load vars for managed-node3 12372 1727204081.15721: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204081.15729: done sending task result for task 12b410aa-8751-244a-02f9-00000000008f 12372 1727204081.15733: WORKER PROCESS EXITING 12372 1727204081.15742: Calling all_plugins_play to load vars for managed-node3 12372 1727204081.15746: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204081.15750: Calling groups_plugins_play to load vars for managed-node3 12372 1727204081.16192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204081.16618: done with get_vars() 12372 1727204081.16633: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.070) 0:00:08.153 ***** 12372 1727204081.16761: entering _queue_task() for managed-node3/ping 12372 1727204081.17022: worker is 1 (out of 1 available) 12372 1727204081.17150: exiting _queue_task() for managed-node3/ping 12372 1727204081.17162: done queuing things up, now waiting for results queue to drain 12372 1727204081.17165: waiting for pending results... 
12372 1727204081.17352: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 12372 1727204081.17461: in run() - task 12b410aa-8751-244a-02f9-000000000090 12372 1727204081.17475: variable 'ansible_search_path' from source: unknown 12372 1727204081.17479: variable 'ansible_search_path' from source: unknown 12372 1727204081.17513: calling self._execute() 12372 1727204081.17583: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204081.17593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204081.17603: variable 'omit' from source: magic vars 12372 1727204081.18022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204081.20996: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204081.21000: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204081.21003: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204081.21030: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204081.21066: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204081.21181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204081.21239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204081.21277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204081.21346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204081.21369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204081.21544: variable 'ansible_distribution' from source: facts 12372 1727204081.21564: variable 'ansible_distribution_major_version' from source: facts 12372 1727204081.21579: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204081.21585: when evaluation is False, skipping this task 12372 1727204081.21595: _execute() done 12372 1727204081.21601: dumping result to json 12372 1727204081.21608: done dumping result, returning 12372 1727204081.21622: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-244a-02f9-000000000090] 12372 1727204081.21631: sending task result for task 12b410aa-8751-244a-02f9-000000000090 12372 1727204081.21995: done sending task result for task 12b410aa-8751-244a-02f9-000000000090 12372 1727204081.21999: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": 
false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204081.22086: no more pending results, returning what we have 12372 1727204081.22092: results queue empty 12372 1727204081.22094: checking for any_errors_fatal 12372 1727204081.22100: done checking for any_errors_fatal 12372 1727204081.22101: checking for max_fail_percentage 12372 1727204081.22103: done checking for max_fail_percentage 12372 1727204081.22104: checking to see if all hosts have failed and the running result is not ok 12372 1727204081.22105: done checking to see if all hosts have failed 12372 1727204081.22107: getting the remaining hosts for this loop 12372 1727204081.22108: done getting the remaining hosts for this loop 12372 1727204081.22112: getting the next task for host managed-node3 12372 1727204081.22122: done getting next task for host managed-node3 12372 1727204081.22125: ^ task is: TASK: meta (role_complete) 12372 1727204081.22128: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204081.22146: getting variables 12372 1727204081.22148: in VariableManager get_vars() 12372 1727204081.22206: Calling all_inventory to load vars for managed-node3 12372 1727204081.22210: Calling groups_inventory to load vars for managed-node3 12372 1727204081.22213: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204081.22222: Calling all_plugins_play to load vars for managed-node3 12372 1727204081.22226: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204081.22230: Calling groups_plugins_play to load vars for managed-node3 12372 1727204081.22523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204081.22814: done with get_vars() 12372 1727204081.22827: done getting variables 12372 1727204081.22920: done queuing things up, now waiting for results queue to drain 12372 1727204081.22922: results queue empty 12372 1727204081.22923: checking for any_errors_fatal 12372 1727204081.22926: done checking for any_errors_fatal 12372 1727204081.22927: checking for max_fail_percentage 12372 1727204081.22928: done checking for max_fail_percentage 12372 1727204081.22929: checking to see if all hosts have failed and the running result is not ok 12372 1727204081.22930: done checking to see if all hosts have failed 12372 1727204081.22931: getting the remaining hosts for this loop 12372 1727204081.22932: done getting the remaining hosts for this loop 12372 1727204081.22935: getting the next task for host managed-node3 12372 1727204081.22940: done getting next task for host managed-node3 12372 1727204081.22942: ^ task is: TASK: From the active connection, get the port1 profile "{{ port1_profile }}" 12372 1727204081.22944: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204081.22947: getting variables 12372 1727204081.22948: in VariableManager get_vars() 12372 1727204081.22971: Calling all_inventory to load vars for managed-node3 12372 1727204081.22974: Calling groups_inventory to load vars for managed-node3 12372 1727204081.22977: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204081.22983: Calling all_plugins_play to load vars for managed-node3 12372 1727204081.22986: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204081.22991: Calling groups_plugins_play to load vars for managed-node3 12372 1727204081.23182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204081.23467: done with get_vars() 12372 1727204081.23477: done getting variables 12372 1727204081.23526: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12372 1727204081.23668: variable 'port1_profile' from source: play vars TASK [From the active connection, get the port1 profile "bond0.0"] ************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:104 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.069) 0:00:08.222 ***** 12372 1727204081.23702: entering _queue_task() for managed-node3/command 12372 1727204081.23991: worker is 1 (out of 1 available) 12372 1727204081.24005: exiting _queue_task() for managed-node3/command 12372 1727204081.24018: done queuing things up, now waiting for results queue to drain 12372 1727204081.24020: waiting for pending results... 
12372 1727204081.24374: running TaskExecutor() for managed-node3/TASK: From the active connection, get the port1 profile "bond0.0" 12372 1727204081.24508: in run() - task 12b410aa-8751-244a-02f9-0000000000c0 12372 1727204081.24512: variable 'ansible_search_path' from source: unknown 12372 1727204081.24515: calling self._execute() 12372 1727204081.24567: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204081.24582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204081.24603: variable 'omit' from source: magic vars 12372 1727204081.25199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204081.27834: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204081.27927: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204081.27997: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204081.28040: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204081.28072: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204081.28164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204081.28208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204081.28297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204081.28307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204081.28331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204081.28501: variable 'ansible_distribution' from source: facts 12372 1727204081.28520: variable 'ansible_distribution_major_version' from source: facts 12372 1727204081.28539: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204081.28548: when evaluation is False, skipping this task 12372 1727204081.28555: _execute() done 12372 1727204081.28564: dumping result to json 12372 1727204081.28572: done dumping result, returning 12372 1727204081.28625: done running TaskExecutor() for managed-node3/TASK: From the active connection, get the port1 profile "bond0.0" [12b410aa-8751-244a-02f9-0000000000c0] 12372 1727204081.28629: sending task result for task 12b410aa-8751-244a-02f9-0000000000c0 12372 1727204081.28903: done sending task result for task 12b410aa-8751-244a-02f9-0000000000c0 12372 1727204081.28907: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204081.28967: no more pending results, returning what we have 12372 1727204081.28972: results queue empty 12372 1727204081.28973: checking for any_errors_fatal 12372 1727204081.28976: done checking for any_errors_fatal 12372 1727204081.28977: checking for max_fail_percentage 12372 1727204081.28979: done checking for max_fail_percentage 12372 1727204081.28980: checking to see if all hosts have failed and the running result is not ok 12372 1727204081.28981: done checking to see if all hosts have failed 12372 1727204081.28982: getting the remaining hosts for this loop 12372 1727204081.28984: done getting the remaining hosts for this loop 12372 1727204081.28992: getting the next task for host managed-node3 12372 1727204081.28999: done getting next task for host managed-node3 12372 1727204081.29002: ^ task is: TASK: From the active connection, get the port2 profile "{{ port2_profile }}" 12372 1727204081.29005: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204081.29008: getting variables 12372 1727204081.29010: in VariableManager get_vars() 12372 1727204081.29071: Calling all_inventory to load vars for managed-node3 12372 1727204081.29076: Calling groups_inventory to load vars for managed-node3 12372 1727204081.29079: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204081.29274: Calling all_plugins_play to load vars for managed-node3 12372 1727204081.29279: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204081.29285: Calling groups_plugins_play to load vars for managed-node3 12372 1727204081.29578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204081.29912: done with get_vars() 12372 1727204081.29932: done getting variables 12372 1727204081.30031: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12372 1727204081.30204: variable 'port2_profile' from source: play vars TASK [From the active connection, get the port2 profile "bond0.1"] ************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:111 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.065) 0:00:08.288 ***** 12372 1727204081.30236: entering _queue_task() for managed-node3/command 12372 1727204081.30637: worker is 1 (out of 1 available) 12372 1727204081.30651: exiting _queue_task() for managed-node3/command 12372 1727204081.30666: done queuing things up, now waiting for results queue to drain 12372 1727204081.30668: waiting for pending results... 
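
The two play-level tasks above ("From the active connection, get the port1 profile ..." and the port2 counterpart) use the command action and template their names from the play vars port1_profile and port2_profile, which this run resolves to bond0.0 and bond0.1. The command line they run is not captured in this excerpt; a hypothetical task of the same shape, assuming nmcli is used to list active connections, might look like:

---
# Illustrative only: the actual command in tests_bond_removal.yml is not shown
# in this log excerpt; using nmcli here is an assumption.
- hosts: managed-node3
  gather_facts: false
  vars:
    port1_profile: bond0.0        # value resolved from play vars in the log above
  tasks:
    - name: From the active connection, get the port1 profile "{{ port1_profile }}"
      ansible.builtin.command: nmcli -f NAME connection show --active
      register: active_profiles   # hypothetical register name
      changed_when: false
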
12372 1727204081.31009: running TaskExecutor() for managed-node3/TASK: From the active connection, get the port2 profile "bond0.1" 12372 1727204081.31059: in run() - task 12b410aa-8751-244a-02f9-0000000000c1 12372 1727204081.31080: variable 'ansible_search_path' from source: unknown 12372 1727204081.31132: calling self._execute() 12372 1727204081.31232: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204081.31246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204081.31262: variable 'omit' from source: magic vars 12372 1727204081.31785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204081.33796: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204081.33800: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204081.33804: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204081.33851: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204081.33888: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204081.33997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204081.34043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204081.34080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204081.34141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204081.34164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204081.34338: variable 'ansible_distribution' from source: facts 12372 1727204081.34352: variable 'ansible_distribution_major_version' from source: facts 12372 1727204081.34373: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204081.34383: when evaluation is False, skipping this task 12372 1727204081.34394: _execute() done 12372 1727204081.34404: dumping result to json 12372 1727204081.34414: done dumping result, returning 12372 1727204081.34431: done running TaskExecutor() for managed-node3/TASK: From the active connection, get the port2 profile "bond0.1" [12b410aa-8751-244a-02f9-0000000000c1] 12372 1727204081.34443: sending task result for task 12b410aa-8751-244a-02f9-0000000000c1 skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204081.34711: no more pending 
results, returning what we have 12372 1727204081.34715: results queue empty 12372 1727204081.34718: checking for any_errors_fatal 12372 1727204081.34724: done checking for any_errors_fatal 12372 1727204081.34725: checking for max_fail_percentage 12372 1727204081.34727: done checking for max_fail_percentage 12372 1727204081.34727: checking to see if all hosts have failed and the running result is not ok 12372 1727204081.34728: done checking to see if all hosts have failed 12372 1727204081.34729: getting the remaining hosts for this loop 12372 1727204081.34731: done getting the remaining hosts for this loop 12372 1727204081.34736: getting the next task for host managed-node3 12372 1727204081.34743: done getting next task for host managed-node3 12372 1727204081.34746: ^ task is: TASK: Assert that the port1 profile is not activated 12372 1727204081.34749: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204081.34752: getting variables 12372 1727204081.34754: in VariableManager get_vars() 12372 1727204081.34813: Calling all_inventory to load vars for managed-node3 12372 1727204081.34819: Calling groups_inventory to load vars for managed-node3 12372 1727204081.34822: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204081.34829: done sending task result for task 12b410aa-8751-244a-02f9-0000000000c1 12372 1727204081.34832: WORKER PROCESS EXITING 12372 1727204081.34841: Calling all_plugins_play to load vars for managed-node3 12372 1727204081.34845: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204081.34849: Calling groups_plugins_play to load vars for managed-node3 12372 1727204081.35031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204081.35224: done with get_vars() 12372 1727204081.35233: done getting variables 12372 1727204081.35278: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the port1 profile is not activated] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:118 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.050) 0:00:08.338 ***** 12372 1727204081.35304: entering _queue_task() for managed-node3/assert 12372 1727204081.35519: worker is 1 (out of 1 available) 12372 1727204081.35534: exiting _queue_task() for managed-node3/assert 12372 1727204081.35549: done queuing things up, now waiting for results queue to drain 12372 1727204081.35551: waiting for pending results... 
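
Because every one of these tasks is skipped on this host, anything registered from them carries skip metadata rather than command output. A later task that needs to branch on that should use the skipped test instead of reading stdout directly; a small sketch, reusing the hypothetical names from the command sketch above:

# Sketch only: active_profiles and port1_profile are the hypothetical names
# introduced in the earlier command example.
- name: Only parse the nmcli output when the query actually ran
  ansible.builtin.set_fact:
    port1_active: "{{ port1_profile in active_profiles.stdout }}"
  when: active_profiles is not skipped
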
12372 1727204081.35723: running TaskExecutor() for managed-node3/TASK: Assert that the port1 profile is not activated 12372 1727204081.35787: in run() - task 12b410aa-8751-244a-02f9-0000000000c2 12372 1727204081.35802: variable 'ansible_search_path' from source: unknown 12372 1727204081.35835: calling self._execute() 12372 1727204081.35918: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204081.35923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204081.35933: variable 'omit' from source: magic vars 12372 1727204081.36294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204081.38749: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204081.38809: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204081.38841: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204081.38875: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204081.38902: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204081.38975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204081.38999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204081.39023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204081.39055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204081.39071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204081.39186: variable 'ansible_distribution' from source: facts 12372 1727204081.39194: variable 'ansible_distribution_major_version' from source: facts 12372 1727204081.39205: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204081.39208: when evaluation is False, skipping this task 12372 1727204081.39212: _execute() done 12372 1727204081.39217: dumping result to json 12372 1727204081.39223: done dumping result, returning 12372 1727204081.39231: done running TaskExecutor() for managed-node3/TASK: Assert that the port1 profile is not activated [12b410aa-8751-244a-02f9-0000000000c2] 12372 1727204081.39236: sending task result for task 12b410aa-8751-244a-02f9-0000000000c2 12372 1727204081.39330: done sending task result for task 12b410aa-8751-244a-02f9-0000000000c2 12372 1727204081.39333: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204081.39385: no more pending results, returning what we have 12372 1727204081.39391: results queue empty 12372 1727204081.39392: checking for any_errors_fatal 12372 1727204081.39399: done checking for any_errors_fatal 12372 1727204081.39399: checking for max_fail_percentage 12372 1727204081.39401: done checking for max_fail_percentage 12372 1727204081.39402: checking to see if all hosts have failed and the running result is not ok 12372 1727204081.39403: done checking to see if all hosts have failed 12372 1727204081.39404: getting the remaining hosts for this loop 12372 1727204081.39406: done getting the remaining hosts for this loop 12372 1727204081.39410: getting the next task for host managed-node3 12372 1727204081.39416: done getting next task for host managed-node3 12372 1727204081.39419: ^ task is: TASK: Assert that the port2 profile is not activated 12372 1727204081.39421: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204081.39425: getting variables 12372 1727204081.39426: in VariableManager get_vars() 12372 1727204081.39478: Calling all_inventory to load vars for managed-node3 12372 1727204081.39481: Calling groups_inventory to load vars for managed-node3 12372 1727204081.39483: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204081.39501: Calling all_plugins_play to load vars for managed-node3 12372 1727204081.39504: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204081.39508: Calling groups_plugins_play to load vars for managed-node3 12372 1727204081.39666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204081.39832: done with get_vars() 12372 1727204081.39842: done getting variables 12372 1727204081.39887: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the port2 profile is not activated] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:125 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.046) 0:00:08.384 ***** 12372 1727204081.39911: entering _queue_task() for managed-node3/assert 12372 1727204081.40261: worker is 1 (out of 1 available) 12372 1727204081.40274: exiting _queue_task() for managed-node3/assert 12372 1727204081.40286: done queuing things up, now waiting for results queue to drain 12372 1727204081.40288: waiting for pending results... 
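
The "Assert that the port1/port2 profile is not activated" tasks load the assert action plugin and carry the same distribution guard; their actual assertions are not visible here. A sketch in the same spirit, again reusing the hypothetical active_profiles register:

# Illustrative assertion, not the test playbook's literal source.
- name: Assert that the port1 profile is not activated
  ansible.builtin.assert:
    that:
      - port1_profile not in active_profiles.stdout
    fail_msg: "{{ port1_profile }} is still listed as an active connection"
  when: >-
    ansible_distribution in ['CentOS', 'RedHat']
    and ansible_distribution_major_version | int < 9
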
12372 1727204081.40508: running TaskExecutor() for managed-node3/TASK: Assert that the port2 profile is not activated 12372 1727204081.40520: in run() - task 12b410aa-8751-244a-02f9-0000000000c3 12372 1727204081.40535: variable 'ansible_search_path' from source: unknown 12372 1727204081.40578: calling self._execute() 12372 1727204081.40695: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204081.40702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204081.40826: variable 'omit' from source: magic vars 12372 1727204081.41311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204081.43903: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204081.43957: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204081.43992: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204081.44024: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204081.44047: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204081.44118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204081.44143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204081.44165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204081.44198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204081.44217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204081.44329: variable 'ansible_distribution' from source: facts 12372 1727204081.44335: variable 'ansible_distribution_major_version' from source: facts 12372 1727204081.44345: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204081.44349: when evaluation is False, skipping this task 12372 1727204081.44352: _execute() done 12372 1727204081.44356: dumping result to json 12372 1727204081.44361: done dumping result, returning 12372 1727204081.44368: done running TaskExecutor() for managed-node3/TASK: Assert that the port2 profile is not activated [12b410aa-8751-244a-02f9-0000000000c3] 12372 1727204081.44374: sending task result for task 12b410aa-8751-244a-02f9-0000000000c3 12372 1727204081.44467: done sending task result for task 12b410aa-8751-244a-02f9-0000000000c3 12372 1727204081.44470: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204081.44524: no more pending results, returning what we have 12372 1727204081.44527: results queue empty 12372 1727204081.44529: checking for any_errors_fatal 12372 1727204081.44536: done checking for any_errors_fatal 12372 1727204081.44537: checking for max_fail_percentage 12372 1727204081.44539: done checking for max_fail_percentage 12372 1727204081.44540: checking to see if all hosts have failed and the running result is not ok 12372 1727204081.44541: done checking to see if all hosts have failed 12372 1727204081.44542: getting the remaining hosts for this loop 12372 1727204081.44543: done getting the remaining hosts for this loop 12372 1727204081.44547: getting the next task for host managed-node3 12372 1727204081.44553: done getting next task for host managed-node3 12372 1727204081.44556: ^ task is: TASK: Get the port1 device state 12372 1727204081.44558: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204081.44561: getting variables 12372 1727204081.44563: in VariableManager get_vars() 12372 1727204081.44619: Calling all_inventory to load vars for managed-node3 12372 1727204081.44623: Calling groups_inventory to load vars for managed-node3 12372 1727204081.44626: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204081.44635: Calling all_plugins_play to load vars for managed-node3 12372 1727204081.44638: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204081.44641: Calling groups_plugins_play to load vars for managed-node3 12372 1727204081.44817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204081.44975: done with get_vars() 12372 1727204081.44983: done getting variables 12372 1727204081.45033: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the port1 device state] ********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:132 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.051) 0:00:08.436 ***** 12372 1727204081.45055: entering _queue_task() for managed-node3/command 12372 1727204081.45242: worker is 1 (out of 1 available) 12372 1727204081.45256: exiting _queue_task() for managed-node3/command 12372 1727204081.45269: done queuing things up, now waiting for results queue to drain 12372 1727204081.45271: waiting for pending results... 
12372 1727204081.45441: running TaskExecutor() for managed-node3/TASK: Get the port1 device state 12372 1727204081.45508: in run() - task 12b410aa-8751-244a-02f9-0000000000c4 12372 1727204081.45520: variable 'ansible_search_path' from source: unknown 12372 1727204081.45560: calling self._execute() 12372 1727204081.45794: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204081.45798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204081.45801: variable 'omit' from source: magic vars 12372 1727204081.46192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204081.48082: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204081.48146: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204081.48178: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204081.48221: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204081.48243: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204081.48318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204081.48342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204081.48364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204081.48401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204081.48418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204081.48524: variable 'ansible_distribution' from source: facts 12372 1727204081.48532: variable 'ansible_distribution_major_version' from source: facts 12372 1727204081.48542: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204081.48545: when evaluation is False, skipping this task 12372 1727204081.48549: _execute() done 12372 1727204081.48554: dumping result to json 12372 1727204081.48558: done dumping result, returning 12372 1727204081.48565: done running TaskExecutor() for managed-node3/TASK: Get the port1 device state [12b410aa-8751-244a-02f9-0000000000c4] 12372 1727204081.48571: sending task result for task 12b410aa-8751-244a-02f9-0000000000c4 12372 1727204081.48664: done sending task result for task 12b410aa-8751-244a-02f9-0000000000c4 12372 1727204081.48667: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", 
"skip_reason": "Conditional result was False" } 12372 1727204081.48727: no more pending results, returning what we have 12372 1727204081.48731: results queue empty 12372 1727204081.48732: checking for any_errors_fatal 12372 1727204081.48739: done checking for any_errors_fatal 12372 1727204081.48740: checking for max_fail_percentage 12372 1727204081.48741: done checking for max_fail_percentage 12372 1727204081.48742: checking to see if all hosts have failed and the running result is not ok 12372 1727204081.48743: done checking to see if all hosts have failed 12372 1727204081.48744: getting the remaining hosts for this loop 12372 1727204081.48746: done getting the remaining hosts for this loop 12372 1727204081.48750: getting the next task for host managed-node3 12372 1727204081.48755: done getting next task for host managed-node3 12372 1727204081.48758: ^ task is: TASK: Get the port2 device state 12372 1727204081.48761: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204081.48764: getting variables 12372 1727204081.48765: in VariableManager get_vars() 12372 1727204081.48815: Calling all_inventory to load vars for managed-node3 12372 1727204081.48821: Calling groups_inventory to load vars for managed-node3 12372 1727204081.48824: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204081.48834: Calling all_plugins_play to load vars for managed-node3 12372 1727204081.48837: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204081.48840: Calling groups_plugins_play to load vars for managed-node3 12372 1727204081.48984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204081.49157: done with get_vars() 12372 1727204081.49165: done getting variables 12372 1727204081.49215: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the port2 device state] ********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:139 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.041) 0:00:08.478 ***** 12372 1727204081.49238: entering _queue_task() for managed-node3/command 12372 1727204081.49429: worker is 1 (out of 1 available) 12372 1727204081.49442: exiting _queue_task() for managed-node3/command 12372 1727204081.49454: done queuing things up, now waiting for results queue to drain 12372 1727204081.49456: waiting for pending results... 
12372 1727204081.49634: running TaskExecutor() for managed-node3/TASK: Get the port2 device state 12372 1727204081.49704: in run() - task 12b410aa-8751-244a-02f9-0000000000c5 12372 1727204081.49716: variable 'ansible_search_path' from source: unknown 12372 1727204081.49750: calling self._execute() 12372 1727204081.49831: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204081.49839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204081.49848: variable 'omit' from source: magic vars 12372 1727204081.50219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204081.52012: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204081.52063: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204081.52100: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204081.52130: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204081.52152: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204081.52223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204081.52246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204081.52267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204081.52302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204081.52320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204081.52425: variable 'ansible_distribution' from source: facts 12372 1727204081.52429: variable 'ansible_distribution_major_version' from source: facts 12372 1727204081.52441: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204081.52444: when evaluation is False, skipping this task 12372 1727204081.52447: _execute() done 12372 1727204081.52450: dumping result to json 12372 1727204081.52455: done dumping result, returning 12372 1727204081.52463: done running TaskExecutor() for managed-node3/TASK: Get the port2 device state [12b410aa-8751-244a-02f9-0000000000c5] 12372 1727204081.52468: sending task result for task 12b410aa-8751-244a-02f9-0000000000c5 12372 1727204081.52557: done sending task result for task 12b410aa-8751-244a-02f9-0000000000c5 12372 1727204081.52560: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", 
"skip_reason": "Conditional result was False" } 12372 1727204081.52614: no more pending results, returning what we have 12372 1727204081.52620: results queue empty 12372 1727204081.52621: checking for any_errors_fatal 12372 1727204081.52627: done checking for any_errors_fatal 12372 1727204081.52628: checking for max_fail_percentage 12372 1727204081.52630: done checking for max_fail_percentage 12372 1727204081.52630: checking to see if all hosts have failed and the running result is not ok 12372 1727204081.52632: done checking to see if all hosts have failed 12372 1727204081.52633: getting the remaining hosts for this loop 12372 1727204081.52634: done getting the remaining hosts for this loop 12372 1727204081.52638: getting the next task for host managed-node3 12372 1727204081.52644: done getting next task for host managed-node3 12372 1727204081.52646: ^ task is: TASK: Assert that the port1 device is in DOWN state 12372 1727204081.52648: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204081.52651: getting variables 12372 1727204081.52653: in VariableManager get_vars() 12372 1727204081.52702: Calling all_inventory to load vars for managed-node3 12372 1727204081.52705: Calling groups_inventory to load vars for managed-node3 12372 1727204081.52708: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204081.52720: Calling all_plugins_play to load vars for managed-node3 12372 1727204081.52723: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204081.52726: Calling groups_plugins_play to load vars for managed-node3 12372 1727204081.52928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204081.53085: done with get_vars() 12372 1727204081.53094: done getting variables 12372 1727204081.53143: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the port1 device is in DOWN state] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:146 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.039) 0:00:08.517 ***** 12372 1727204081.53165: entering _queue_task() for managed-node3/assert 12372 1727204081.53358: worker is 1 (out of 1 available) 12372 1727204081.53373: exiting _queue_task() for managed-node3/assert 12372 1727204081.53386: done queuing things up, now waiting for results queue to drain 12372 1727204081.53388: waiting for pending results... 
12372 1727204081.53553: running TaskExecutor() for managed-node3/TASK: Assert that the port1 device is in DOWN state 12372 1727204081.53625: in run() - task 12b410aa-8751-244a-02f9-0000000000c6 12372 1727204081.53636: variable 'ansible_search_path' from source: unknown 12372 1727204081.53668: calling self._execute() 12372 1727204081.53743: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204081.53747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204081.53757: variable 'omit' from source: magic vars 12372 1727204081.54108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204081.55801: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204081.55854: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204081.55884: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204081.55921: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204081.55941: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204081.56007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204081.56037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204081.56058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204081.56091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204081.56104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204081.56212: variable 'ansible_distribution' from source: facts 12372 1727204081.56220: variable 'ansible_distribution_major_version' from source: facts 12372 1727204081.56230: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204081.56233: when evaluation is False, skipping this task 12372 1727204081.56236: _execute() done 12372 1727204081.56240: dumping result to json 12372 1727204081.56251: done dumping result, returning 12372 1727204081.56254: done running TaskExecutor() for managed-node3/TASK: Assert that the port1 device is in DOWN state [12b410aa-8751-244a-02f9-0000000000c6] 12372 1727204081.56259: sending task result for task 12b410aa-8751-244a-02f9-0000000000c6 12372 1727204081.56359: done sending task result for task 12b410aa-8751-244a-02f9-0000000000c6 12372 1727204081.56362: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204081.56413: no more pending results, returning what we have 12372 1727204081.56419: results queue empty 12372 1727204081.56420: checking for any_errors_fatal 12372 1727204081.56425: done checking for any_errors_fatal 12372 1727204081.56426: checking for max_fail_percentage 12372 1727204081.56428: done checking for max_fail_percentage 12372 1727204081.56429: checking to see if all hosts have failed and the running result is not ok 12372 1727204081.56430: done checking to see if all hosts have failed 12372 1727204081.56431: getting the remaining hosts for this loop 12372 1727204081.56433: done getting the remaining hosts for this loop 12372 1727204081.56436: getting the next task for host managed-node3 12372 1727204081.56442: done getting next task for host managed-node3 12372 1727204081.56444: ^ task is: TASK: Assert that the port2 device is in DOWN state 12372 1727204081.56446: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204081.56450: getting variables 12372 1727204081.56451: in VariableManager get_vars() 12372 1727204081.56500: Calling all_inventory to load vars for managed-node3 12372 1727204081.56503: Calling groups_inventory to load vars for managed-node3 12372 1727204081.56506: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204081.56515: Calling all_plugins_play to load vars for managed-node3 12372 1727204081.56520: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204081.56523: Calling groups_plugins_play to load vars for managed-node3 12372 1727204081.56676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204081.56842: done with get_vars() 12372 1727204081.56851: done getting variables 12372 1727204081.56904: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the port2 device is in DOWN state] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:153 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.037) 0:00:08.554 ***** 12372 1727204081.56927: entering _queue_task() for managed-node3/assert 12372 1727204081.57154: worker is 1 (out of 1 available) 12372 1727204081.57170: exiting _queue_task() for managed-node3/assert 12372 1727204081.57183: done queuing things up, now waiting for results queue to drain 12372 1727204081.57185: waiting for pending results... 
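Each TASK banner above is followed by a timing line such as "Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.037) 0:00:08.554 *****": the parenthesised figure is how long the previous task took and the trailing figure is the cumulative playbook time, which is why the totals in this section creep up by a few hundredths of a second per skipped task. A small reader-side sketch (not part of the run) for pulling those pairs out of a saved log:

    # Extract (task duration, cumulative time) pairs, in seconds, from the
    # timing lines printed after each TASK banner above.
    import re

    TIMING = re.compile(r"\((\d+:\d+:\d+\.\d+)\)\s+(\d+:\d+:\d+\.\d+)\s+\*+")

    def to_seconds(hms: str) -> float:
        h, m, s = hms.split(":")
        return int(h) * 3600 + int(m) * 60 + float(s)

    def timings(log_text: str):
        for duration, total in TIMING.findall(log_text):
            yield to_seconds(duration), to_seconds(total)

    # the line quoted above yields roughly (0.037, 8.554)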
12372 1727204081.57522: running TaskExecutor() for managed-node3/TASK: Assert that the port2 device is in DOWN state 12372 1727204081.57602: in run() - task 12b410aa-8751-244a-02f9-0000000000c7 12372 1727204081.57630: variable 'ansible_search_path' from source: unknown 12372 1727204081.57726: calling self._execute() 12372 1727204081.57779: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204081.57796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204081.57812: variable 'omit' from source: magic vars 12372 1727204081.58417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204081.60139: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204081.60197: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204081.60394: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204081.60397: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204081.60400: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204081.60421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204081.60462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204081.60500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204081.60558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204081.60581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204081.60740: variable 'ansible_distribution' from source: facts 12372 1727204081.60754: variable 'ansible_distribution_major_version' from source: facts 12372 1727204081.60772: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204081.60780: when evaluation is False, skipping this task 12372 1727204081.60787: _execute() done 12372 1727204081.60799: dumping result to json 12372 1727204081.60808: done dumping result, returning 12372 1727204081.60823: done running TaskExecutor() for managed-node3/TASK: Assert that the port2 device is in DOWN state [12b410aa-8751-244a-02f9-0000000000c7] 12372 1727204081.60834: sending task result for task 12b410aa-8751-244a-02f9-0000000000c7 skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204081.61000: no more pending results, returning what we 
have 12372 1727204081.61035: results queue empty 12372 1727204081.61037: checking for any_errors_fatal 12372 1727204081.61043: done checking for any_errors_fatal 12372 1727204081.61044: checking for max_fail_percentage 12372 1727204081.61046: done checking for max_fail_percentage 12372 1727204081.61047: checking to see if all hosts have failed and the running result is not ok 12372 1727204081.61048: done checking to see if all hosts have failed 12372 1727204081.61049: getting the remaining hosts for this loop 12372 1727204081.61051: done getting the remaining hosts for this loop 12372 1727204081.61055: getting the next task for host managed-node3 12372 1727204081.61063: done getting next task for host managed-node3 12372 1727204081.61068: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12372 1727204081.61071: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204081.61095: getting variables 12372 1727204081.61097: in VariableManager get_vars() 12372 1727204081.61251: Calling all_inventory to load vars for managed-node3 12372 1727204081.61255: Calling groups_inventory to load vars for managed-node3 12372 1727204081.61258: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204081.61264: done sending task result for task 12b410aa-8751-244a-02f9-0000000000c7 12372 1727204081.61267: WORKER PROCESS EXITING 12372 1727204081.61277: Calling all_plugins_play to load vars for managed-node3 12372 1727204081.61280: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204081.61285: Calling groups_plugins_play to load vars for managed-node3 12372 1727204081.61577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204081.62015: done with get_vars() 12372 1727204081.62027: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.052) 0:00:08.607 ***** 12372 1727204081.62143: entering _queue_task() for managed-node3/include_tasks 12372 1727204081.62401: worker is 1 (out of 1 available) 12372 1727204081.62416: exiting _queue_task() for managed-node3/include_tasks 12372 1727204081.62502: done queuing things up, now waiting for results queue to drain 12372 1727204081.62504: waiting for pending results... 
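The "^ state is: HOST STATE: ..." dumps above expose the per-host bookkeeping the strategy keeps while it walks the play: the block and task indices managed-node3 has reached, whether it is inside a rescue or always section, and, now that role tasks begin, a nested child state for the included file ("tasks child state? (HOST STATE: block=0, task=1, ...)"). A rough reader-side model of just those printed fields (an assumption for illustration, not Ansible's internal HostState class):

    # Container for the values the "HOST STATE:" dumps print; not Ansible's own class.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class HostStateSnapshot:
        block: int
        task: int
        rescue: int = 0
        always: int = 0
        handlers: int = 0
        run_state: int = 1             # the log shows run_state=1 while tasks iterate
        fail_state: int = 0
        update_handlers: bool = True
        pending_setup: bool = False
        tasks_child: Optional["HostStateSnapshot"] = None   # nested role/include state

    # e.g. the state printed just before the role's "Ensure ansible_facts" task:
    snap = HostStateSnapshot(block=2, task=24,
                             tasks_child=HostStateSnapshot(block=0, task=1))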
12372 1727204081.62735: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12372 1727204081.62903: in run() - task 12b410aa-8751-244a-02f9-0000000000cf 12372 1727204081.62925: variable 'ansible_search_path' from source: unknown 12372 1727204081.62932: variable 'ansible_search_path' from source: unknown 12372 1727204081.62981: calling self._execute() 12372 1727204081.63098: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204081.63113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204081.63131: variable 'omit' from source: magic vars 12372 1727204081.63701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204081.66473: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204081.66555: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204081.66610: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204081.66658: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204081.66709: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204081.66816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204081.66858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204081.66902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204081.66963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204081.66995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204081.67295: variable 'ansible_distribution' from source: facts 12372 1727204081.67299: variable 'ansible_distribution_major_version' from source: facts 12372 1727204081.67302: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204081.67304: when evaluation is False, skipping this task 12372 1727204081.67306: _execute() done 12372 1727204081.67309: dumping result to json 12372 1727204081.67312: done dumping result, returning 12372 1727204081.67314: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-244a-02f9-0000000000cf] 12372 1727204081.67316: sending task result for task 12b410aa-8751-244a-02f9-0000000000cf 12372 1727204081.67399: done sending task result for task 12b410aa-8751-244a-02f9-0000000000cf 12372 1727204081.67403: WORKER PROCESS EXITING skipping: 
[managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204081.67458: no more pending results, returning what we have 12372 1727204081.67463: results queue empty 12372 1727204081.67464: checking for any_errors_fatal 12372 1727204081.67471: done checking for any_errors_fatal 12372 1727204081.67472: checking for max_fail_percentage 12372 1727204081.67474: done checking for max_fail_percentage 12372 1727204081.67475: checking to see if all hosts have failed and the running result is not ok 12372 1727204081.67476: done checking to see if all hosts have failed 12372 1727204081.67477: getting the remaining hosts for this loop 12372 1727204081.67479: done getting the remaining hosts for this loop 12372 1727204081.67484: getting the next task for host managed-node3 12372 1727204081.67494: done getting next task for host managed-node3 12372 1727204081.67499: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12372 1727204081.67503: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204081.67525: getting variables 12372 1727204081.67528: in VariableManager get_vars() 12372 1727204081.67711: Calling all_inventory to load vars for managed-node3 12372 1727204081.67715: Calling groups_inventory to load vars for managed-node3 12372 1727204081.67719: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204081.67730: Calling all_plugins_play to load vars for managed-node3 12372 1727204081.67733: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204081.67738: Calling groups_plugins_play to load vars for managed-node3 12372 1727204081.68106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204081.68416: done with get_vars() 12372 1727204081.68429: done getting variables 12372 1727204081.68503: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.063) 0:00:08.671 ***** 12372 1727204081.68542: entering _queue_task() for managed-node3/debug 12372 1727204081.68915: worker is 1 (out of 1 available) 12372 1727204081.68928: exiting _queue_task() for managed-node3/debug 12372 1727204081.68941: done queuing things up, now waiting for results queue to drain 12372 1727204081.68943: waiting for pending results... 
12372 1727204081.69358: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 12372 1727204081.69363: in run() - task 12b410aa-8751-244a-02f9-0000000000d0 12372 1727204081.69367: variable 'ansible_search_path' from source: unknown 12372 1727204081.69370: variable 'ansible_search_path' from source: unknown 12372 1727204081.69399: calling self._execute() 12372 1727204081.69497: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204081.69513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204081.69532: variable 'omit' from source: magic vars 12372 1727204081.70141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204081.72710: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204081.72815: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204081.72870: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204081.72921: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204081.72967: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204081.73073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204081.73172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204081.73176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204081.73225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204081.73249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204081.73428: variable 'ansible_distribution' from source: facts 12372 1727204081.73440: variable 'ansible_distribution_major_version' from source: facts 12372 1727204081.73496: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204081.73499: when evaluation is False, skipping this task 12372 1727204081.73507: _execute() done 12372 1727204081.73510: dumping result to json 12372 1727204081.73512: done dumping result, returning 12372 1727204081.73515: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-244a-02f9-0000000000d0] 12372 1727204081.73517: sending task result for task 12b410aa-8751-244a-02f9-0000000000d0 12372 1727204081.73718: done sending task result for task 12b410aa-8751-244a-02f9-0000000000d0 12372 1727204081.73723: WORKER PROCESS EXITING skipping: [managed-node3] => { 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12372 1727204081.73776: no more pending results, returning what we have 12372 1727204081.73780: results queue empty 12372 1727204081.73781: checking for any_errors_fatal 12372 1727204081.73788: done checking for any_errors_fatal 12372 1727204081.73882: checking for max_fail_percentage 12372 1727204081.73885: done checking for max_fail_percentage 12372 1727204081.73887: checking to see if all hosts have failed and the running result is not ok 12372 1727204081.73888: done checking to see if all hosts have failed 12372 1727204081.73891: getting the remaining hosts for this loop 12372 1727204081.73893: done getting the remaining hosts for this loop 12372 1727204081.73897: getting the next task for host managed-node3 12372 1727204081.73903: done getting next task for host managed-node3 12372 1727204081.73907: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12372 1727204081.73911: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204081.73965: getting variables 12372 1727204081.73967: in VariableManager get_vars() 12372 1727204081.74051: Calling all_inventory to load vars for managed-node3 12372 1727204081.74101: Calling groups_inventory to load vars for managed-node3 12372 1727204081.74104: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204081.74111: Calling all_plugins_play to load vars for managed-node3 12372 1727204081.74113: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204081.74118: Calling groups_plugins_play to load vars for managed-node3 12372 1727204081.74259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204081.74428: done with get_vars() 12372 1727204081.74437: done getting variables 12372 1727204081.74485: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.059) 0:00:08.730 ***** 12372 1727204081.74518: entering _queue_task() for managed-node3/fail 12372 1727204081.74723: worker is 1 (out of 1 available) 12372 1727204081.74739: exiting _queue_task() for managed-node3/fail 12372 1727204081.74753: done queuing things up, now waiting for results queue to drain 12372 1727204081.74755: waiting for pending results... 
12372 1727204081.74929: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12372 1727204081.75030: in run() - task 12b410aa-8751-244a-02f9-0000000000d1 12372 1727204081.75041: variable 'ansible_search_path' from source: unknown 12372 1727204081.75045: variable 'ansible_search_path' from source: unknown 12372 1727204081.75079: calling self._execute() 12372 1727204081.75151: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204081.75158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204081.75168: variable 'omit' from source: magic vars 12372 1727204081.75527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204081.77665: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204081.77726: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204081.77755: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204081.77789: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204081.77812: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204081.77879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204081.77910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204081.77933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204081.77967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204081.77980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204081.78094: variable 'ansible_distribution' from source: facts 12372 1727204081.78100: variable 'ansible_distribution_major_version' from source: facts 12372 1727204081.78112: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204081.78115: when evaluation is False, skipping this task 12372 1727204081.78122: _execute() done 12372 1727204081.78125: dumping result to json 12372 1727204081.78127: done dumping result, returning 12372 1727204081.78139: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-244a-02f9-0000000000d1] 12372 1727204081.78142: sending task result for task 
12b410aa-8751-244a-02f9-0000000000d1 12372 1727204081.78245: done sending task result for task 12b410aa-8751-244a-02f9-0000000000d1 12372 1727204081.78250: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204081.78301: no more pending results, returning what we have 12372 1727204081.78305: results queue empty 12372 1727204081.78306: checking for any_errors_fatal 12372 1727204081.78311: done checking for any_errors_fatal 12372 1727204081.78312: checking for max_fail_percentage 12372 1727204081.78314: done checking for max_fail_percentage 12372 1727204081.78318: checking to see if all hosts have failed and the running result is not ok 12372 1727204081.78319: done checking to see if all hosts have failed 12372 1727204081.78320: getting the remaining hosts for this loop 12372 1727204081.78321: done getting the remaining hosts for this loop 12372 1727204081.78326: getting the next task for host managed-node3 12372 1727204081.78332: done getting next task for host managed-node3 12372 1727204081.78336: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12372 1727204081.78339: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204081.78360: getting variables 12372 1727204081.78362: in VariableManager get_vars() 12372 1727204081.78422: Calling all_inventory to load vars for managed-node3 12372 1727204081.78425: Calling groups_inventory to load vars for managed-node3 12372 1727204081.78428: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204081.78437: Calling all_plugins_play to load vars for managed-node3 12372 1727204081.78440: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204081.78443: Calling groups_plugins_play to load vars for managed-node3 12372 1727204081.78587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204081.78781: done with get_vars() 12372 1727204081.78791: done getting variables 12372 1727204081.78842: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.043) 0:00:08.774 ***** 12372 1727204081.78868: entering _queue_task() for managed-node3/fail 12372 1727204081.79074: worker is 1 (out of 1 available) 12372 1727204081.79093: exiting _queue_task() for managed-node3/fail 12372 1727204081.79107: done queuing things up, now waiting for results queue to drain 12372 1727204081.79109: waiting for pending results... 
12372 1727204081.79408: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12372 1727204081.79494: in run() - task 12b410aa-8751-244a-02f9-0000000000d2 12372 1727204081.79521: variable 'ansible_search_path' from source: unknown 12372 1727204081.79532: variable 'ansible_search_path' from source: unknown 12372 1727204081.79577: calling self._execute() 12372 1727204081.79678: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204081.79696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204081.79712: variable 'omit' from source: magic vars 12372 1727204081.80281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204081.82210: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204081.82275: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204081.82307: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204081.82339: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204081.82362: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204081.82454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204081.82593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204081.82597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204081.82605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204081.82614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204081.82898: variable 'ansible_distribution' from source: facts 12372 1727204081.82904: variable 'ansible_distribution_major_version' from source: facts 12372 1727204081.82906: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204081.82909: when evaluation is False, skipping this task 12372 1727204081.82914: _execute() done 12372 1727204081.82918: dumping result to json 12372 1727204081.82920: done dumping result, returning 12372 1727204081.82922: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-244a-02f9-0000000000d2] 12372 1727204081.82924: sending task result for task 12b410aa-8751-244a-02f9-0000000000d2 12372 1727204081.82999: 
done sending task result for task 12b410aa-8751-244a-02f9-0000000000d2 12372 1727204081.83003: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204081.83058: no more pending results, returning what we have 12372 1727204081.83062: results queue empty 12372 1727204081.83063: checking for any_errors_fatal 12372 1727204081.83071: done checking for any_errors_fatal 12372 1727204081.83072: checking for max_fail_percentage 12372 1727204081.83074: done checking for max_fail_percentage 12372 1727204081.83075: checking to see if all hosts have failed and the running result is not ok 12372 1727204081.83076: done checking to see if all hosts have failed 12372 1727204081.83077: getting the remaining hosts for this loop 12372 1727204081.83078: done getting the remaining hosts for this loop 12372 1727204081.83083: getting the next task for host managed-node3 12372 1727204081.83093: done getting next task for host managed-node3 12372 1727204081.83097: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12372 1727204081.83100: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204081.83120: getting variables 12372 1727204081.83121: in VariableManager get_vars() 12372 1727204081.83175: Calling all_inventory to load vars for managed-node3 12372 1727204081.83178: Calling groups_inventory to load vars for managed-node3 12372 1727204081.83181: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204081.83247: Calling all_plugins_play to load vars for managed-node3 12372 1727204081.83252: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204081.83257: Calling groups_plugins_play to load vars for managed-node3 12372 1727204081.83503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204081.83819: done with get_vars() 12372 1727204081.83832: done getting variables 12372 1727204081.83900: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.050) 0:00:08.825 ***** 12372 1727204081.83940: entering _queue_task() for managed-node3/fail 12372 1727204081.84188: worker is 1 (out of 1 available) 12372 1727204081.84208: exiting _queue_task() for managed-node3/fail 12372 1727204081.84224: done queuing things up, now waiting for results queue to drain 12372 1727204081.84226: waiting for pending results... 
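Every internal entry above carries the controller PID and a high-resolution epoch timestamp ("12372 1727204081.83940: ..."), so the spacing between consecutive entries shows where the wall-clock time in this skip loop actually goes; in this stretch the visibly largest gaps, roughly 20-30 ms each, sit around the filter-plugin loading that happens between _execute() and the conditional evaluation. A small reader-side sketch (hypothetical, not part of the run) for surfacing the biggest gaps in a saved log:

    # Report the largest gaps between consecutive "<pid> <epoch>:" entries.
    import re

    STAMP = re.compile(r"\b\d+ (\d{10}\.\d+):")

    def largest_gaps(log_text: str, top: int = 5):
        stamps = [float(t) for t in STAMP.findall(log_text)]
        gaps = ((later - earlier, earlier)
                for earlier, later in zip(stamps, stamps[1:]))
        return sorted(gaps, reverse=True)[:top]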
12372 1727204081.84503: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12372 1727204081.84698: in run() - task 12b410aa-8751-244a-02f9-0000000000d3 12372 1727204081.84703: variable 'ansible_search_path' from source: unknown 12372 1727204081.84706: variable 'ansible_search_path' from source: unknown 12372 1727204081.84737: calling self._execute() 12372 1727204081.84841: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204081.84855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204081.84871: variable 'omit' from source: magic vars 12372 1727204081.85430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204081.87395: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204081.87398: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204081.87401: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204081.87404: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204081.87440: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204081.87541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204081.87585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204081.87632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204081.87691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204081.87715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204081.87887: variable 'ansible_distribution' from source: facts 12372 1727204081.87904: variable 'ansible_distribution_major_version' from source: facts 12372 1727204081.87925: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204081.87933: when evaluation is False, skipping this task 12372 1727204081.87941: _execute() done 12372 1727204081.87949: dumping result to json 12372 1727204081.87958: done dumping result, returning 12372 1727204081.87971: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-244a-02f9-0000000000d3] 12372 1727204081.87982: sending task result for task 12b410aa-8751-244a-02f9-0000000000d3 skipping: [managed-node3] => { 
"changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204081.88164: no more pending results, returning what we have 12372 1727204081.88169: results queue empty 12372 1727204081.88170: checking for any_errors_fatal 12372 1727204081.88179: done checking for any_errors_fatal 12372 1727204081.88180: checking for max_fail_percentage 12372 1727204081.88182: done checking for max_fail_percentage 12372 1727204081.88182: checking to see if all hosts have failed and the running result is not ok 12372 1727204081.88183: done checking to see if all hosts have failed 12372 1727204081.88184: getting the remaining hosts for this loop 12372 1727204081.88186: done getting the remaining hosts for this loop 12372 1727204081.88195: getting the next task for host managed-node3 12372 1727204081.88202: done getting next task for host managed-node3 12372 1727204081.88206: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12372 1727204081.88209: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204081.88222: done sending task result for task 12b410aa-8751-244a-02f9-0000000000d3 12372 1727204081.88226: WORKER PROCESS EXITING 12372 1727204081.88307: getting variables 12372 1727204081.88310: in VariableManager get_vars() 12372 1727204081.88377: Calling all_inventory to load vars for managed-node3 12372 1727204081.88384: Calling groups_inventory to load vars for managed-node3 12372 1727204081.88388: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204081.88423: Calling all_plugins_play to load vars for managed-node3 12372 1727204081.88427: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204081.88431: Calling groups_plugins_play to load vars for managed-node3 12372 1727204081.88640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204081.88810: done with get_vars() 12372 1727204081.88821: done getting variables 12372 1727204081.88866: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.049) 0:00:08.874 ***** 12372 1727204081.88895: entering _queue_task() for managed-node3/dnf 12372 1727204081.89097: worker is 1 (out of 1 available) 12372 1727204081.89112: 
exiting _queue_task() for managed-node3/dnf 12372 1727204081.89126: done queuing things up, now waiting for results queue to drain 12372 1727204081.89128: waiting for pending results... 12372 1727204081.89297: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12372 1727204081.89397: in run() - task 12b410aa-8751-244a-02f9-0000000000d4 12372 1727204081.89409: variable 'ansible_search_path' from source: unknown 12372 1727204081.89413: variable 'ansible_search_path' from source: unknown 12372 1727204081.89446: calling self._execute() 12372 1727204081.89515: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204081.89522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204081.89532: variable 'omit' from source: magic vars 12372 1727204081.89884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204081.91591: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204081.91862: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204081.91894: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204081.91924: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204081.91948: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204081.92021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204081.92043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204081.92065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204081.92104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204081.92120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204081.92229: variable 'ansible_distribution' from source: facts 12372 1727204081.92236: variable 'ansible_distribution_major_version' from source: facts 12372 1727204081.92246: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204081.92250: when evaluation is False, skipping this task 12372 1727204081.92253: _execute() done 12372 1727204081.92255: dumping result to json 12372 1727204081.92261: done dumping result, returning 12372 1727204081.92269: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-244a-02f9-0000000000d4] 12372 1727204081.92274: sending task result for task 12b410aa-8751-244a-02f9-0000000000d4 12372 1727204081.92371: done sending task result for task 12b410aa-8751-244a-02f9-0000000000d4 12372 1727204081.92374: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204081.92455: no more pending results, returning what we have 12372 1727204081.92458: results queue empty 12372 1727204081.92459: checking for any_errors_fatal 12372 1727204081.92465: done checking for any_errors_fatal 12372 1727204081.92466: checking for max_fail_percentage 12372 1727204081.92467: done checking for max_fail_percentage 12372 1727204081.92468: checking to see if all hosts have failed and the running result is not ok 12372 1727204081.92470: done checking to see if all hosts have failed 12372 1727204081.92470: getting the remaining hosts for this loop 12372 1727204081.92472: done getting the remaining hosts for this loop 12372 1727204081.92476: getting the next task for host managed-node3 12372 1727204081.92481: done getting next task for host managed-node3 12372 1727204081.92493: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12372 1727204081.92497: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204081.92515: getting variables 12372 1727204081.92519: in VariableManager get_vars() 12372 1727204081.92567: Calling all_inventory to load vars for managed-node3 12372 1727204081.92570: Calling groups_inventory to load vars for managed-node3 12372 1727204081.92571: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204081.92578: Calling all_plugins_play to load vars for managed-node3 12372 1727204081.92580: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204081.92582: Calling groups_plugins_play to load vars for managed-node3 12372 1727204081.92734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204081.92904: done with get_vars() 12372 1727204081.92912: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12372 1727204081.92974: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.041) 0:00:08.915 ***** 12372 1727204081.93000: entering _queue_task() for managed-node3/yum 12372 1727204081.93211: worker is 1 (out of 1 available) 12372 1727204081.93230: exiting _queue_task() for managed-node3/yum 12372 1727204081.93243: done queuing things up, now waiting for results queue to drain 12372 1727204081.93245: waiting for pending results... 
12372 1727204081.93425: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12372 1727204081.93528: in run() - task 12b410aa-8751-244a-02f9-0000000000d5 12372 1727204081.93540: variable 'ansible_search_path' from source: unknown 12372 1727204081.93544: variable 'ansible_search_path' from source: unknown 12372 1727204081.93579: calling self._execute() 12372 1727204081.93655: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204081.93661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204081.93671: variable 'omit' from source: magic vars 12372 1727204081.94042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204081.95998: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204081.96052: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204081.96081: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204081.96116: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204081.96142: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204081.96209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204081.96237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204081.96258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204081.96292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204081.96306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204081.96416: variable 'ansible_distribution' from source: facts 12372 1727204081.96424: variable 'ansible_distribution_major_version' from source: facts 12372 1727204081.96436: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204081.96441: when evaluation is False, skipping this task 12372 1727204081.96443: _execute() done 12372 1727204081.96446: dumping result to json 12372 1727204081.96452: done dumping result, returning 12372 1727204081.96459: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-244a-02f9-0000000000d5] 12372 1727204081.96464: sending task result for task 
12b410aa-8751-244a-02f9-0000000000d5 12372 1727204081.96556: done sending task result for task 12b410aa-8751-244a-02f9-0000000000d5 12372 1727204081.96559: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204081.96612: no more pending results, returning what we have 12372 1727204081.96615: results queue empty 12372 1727204081.96616: checking for any_errors_fatal 12372 1727204081.96622: done checking for any_errors_fatal 12372 1727204081.96623: checking for max_fail_percentage 12372 1727204081.96624: done checking for max_fail_percentage 12372 1727204081.96625: checking to see if all hosts have failed and the running result is not ok 12372 1727204081.96626: done checking to see if all hosts have failed 12372 1727204081.96627: getting the remaining hosts for this loop 12372 1727204081.96629: done getting the remaining hosts for this loop 12372 1727204081.96633: getting the next task for host managed-node3 12372 1727204081.96639: done getting next task for host managed-node3 12372 1727204081.96644: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12372 1727204081.96647: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204081.96664: getting variables 12372 1727204081.96665: in VariableManager get_vars() 12372 1727204081.96723: Calling all_inventory to load vars for managed-node3 12372 1727204081.96726: Calling groups_inventory to load vars for managed-node3 12372 1727204081.96729: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204081.96737: Calling all_plugins_play to load vars for managed-node3 12372 1727204081.96740: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204081.96742: Calling groups_plugins_play to load vars for managed-node3 12372 1727204081.96920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204081.97081: done with get_vars() 12372 1727204081.97091: done getting variables 12372 1727204081.97139: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.041) 0:00:08.957 ***** 12372 1727204081.97165: entering _queue_task() for managed-node3/fail 12372 1727204081.97363: worker is 1 (out of 1 available) 12372 1727204081.97380: exiting _queue_task() for managed-node3/fail 12372 1727204081.97396: done queuing things up, now waiting for results queue to drain 12372 1727204081.97398: waiting for pending results... 
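Every conditional in this stretch of the log is resolved from cached facts ("variable 'ansible_distribution' from source: facts"), so when a host keeps skipping these tasks the quickest check is to print the two facts involved. A throwaway debugging play (hypothetical, not part of the role) that does so:

- hosts: managed-node3
  gather_facts: true
  tasks:
    - name: Show the facts the network role's conditionals test
      ansible.builtin.debug:
        msg: "{{ ansible_distribution }} {{ ansible_distribution_major_version }}"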
12372 1727204081.97572: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12372 1727204081.97677: in run() - task 12b410aa-8751-244a-02f9-0000000000d6 12372 1727204081.97691: variable 'ansible_search_path' from source: unknown 12372 1727204081.97696: variable 'ansible_search_path' from source: unknown 12372 1727204081.97732: calling self._execute() 12372 1727204081.97802: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204081.97809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204081.97821: variable 'omit' from source: magic vars 12372 1727204081.98182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204082.00102: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204082.00168: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204082.00201: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204082.00231: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204082.00256: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204082.00325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204082.00351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204082.00381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204082.00416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204082.00431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204082.00541: variable 'ansible_distribution' from source: facts 12372 1727204082.00544: variable 'ansible_distribution_major_version' from source: facts 12372 1727204082.00557: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204082.00560: when evaluation is False, skipping this task 12372 1727204082.00562: _execute() done 12372 1727204082.00565: dumping result to json 12372 1727204082.00571: done dumping result, returning 12372 1727204082.00581: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-244a-02f9-0000000000d6] 12372 1727204082.00584: sending task result for task 12b410aa-8751-244a-02f9-0000000000d6 12372 1727204082.00677: done sending task result for task 
12b410aa-8751-244a-02f9-0000000000d6 12372 1727204082.00683: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204082.00738: no more pending results, returning what we have 12372 1727204082.00741: results queue empty 12372 1727204082.00742: checking for any_errors_fatal 12372 1727204082.00749: done checking for any_errors_fatal 12372 1727204082.00750: checking for max_fail_percentage 12372 1727204082.00751: done checking for max_fail_percentage 12372 1727204082.00752: checking to see if all hosts have failed and the running result is not ok 12372 1727204082.00754: done checking to see if all hosts have failed 12372 1727204082.00754: getting the remaining hosts for this loop 12372 1727204082.00756: done getting the remaining hosts for this loop 12372 1727204082.00760: getting the next task for host managed-node3 12372 1727204082.00767: done getting next task for host managed-node3 12372 1727204082.00770: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 12372 1727204082.00773: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204082.00798: getting variables 12372 1727204082.00800: in VariableManager get_vars() 12372 1727204082.00850: Calling all_inventory to load vars for managed-node3 12372 1727204082.00853: Calling groups_inventory to load vars for managed-node3 12372 1727204082.00855: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204082.00864: Calling all_plugins_play to load vars for managed-node3 12372 1727204082.00867: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204082.00870: Calling groups_plugins_play to load vars for managed-node3 12372 1727204082.01036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204082.01208: done with get_vars() 12372 1727204082.01218: done getting variables 12372 1727204082.01265: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.041) 0:00:08.998 ***** 12372 1727204082.01295: entering _queue_task() for managed-node3/package 12372 1727204082.01503: worker is 1 (out of 1 available) 12372 1727204082.01521: exiting _queue_task() for managed-node3/package 12372 1727204082.01535: done queuing things up, now waiting for results queue to drain 12372 1727204082.01537: waiting for pending results... 
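The "Ask user's consent to restart NetworkManager" task skipped just above is loaded through the fail action plugin, i.e. when its condition does hold it deliberately aborts the play unless the operator has opted in. A generic sketch of that pattern (the allow_nm_restart variable is hypothetical, not a setting the role defines):

- hosts: managed-node3
  vars:
    allow_nm_restart: false   # hypothetical opt-in flag
  tasks:
    - name: Ask user's consent to restart NetworkManager
      ansible.builtin.fail:
        msg: Restarting NetworkManager is required; set allow_nm_restart=true to permit it
      when: not (allow_nm_restart | bool)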
12372 1727204082.01709: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 12372 1727204082.01808: in run() - task 12b410aa-8751-244a-02f9-0000000000d7 12372 1727204082.01881: variable 'ansible_search_path' from source: unknown 12372 1727204082.01885: variable 'ansible_search_path' from source: unknown 12372 1727204082.01888: calling self._execute() 12372 1727204082.01929: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204082.01935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204082.01945: variable 'omit' from source: magic vars 12372 1727204082.02370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204082.04243: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204082.04299: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204082.04331: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204082.04362: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204082.04387: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204082.04465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204082.04494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204082.04520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204082.04551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204082.04564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204082.04672: variable 'ansible_distribution' from source: facts 12372 1727204082.04677: variable 'ansible_distribution_major_version' from source: facts 12372 1727204082.04688: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204082.04693: when evaluation is False, skipping this task 12372 1727204082.04696: _execute() done 12372 1727204082.04702: dumping result to json 12372 1727204082.04707: done dumping result, returning 12372 1727204082.04720: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-244a-02f9-0000000000d7] 12372 1727204082.04723: sending task result for task 12b410aa-8751-244a-02f9-0000000000d7 12372 1727204082.04819: done sending task result for task 12b410aa-8751-244a-02f9-0000000000d7 12372 1727204082.04823: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204082.04876: no more pending results, returning what we have 12372 1727204082.04881: results queue empty 12372 1727204082.04883: checking for any_errors_fatal 12372 1727204082.04891: done checking for any_errors_fatal 12372 1727204082.04892: checking for max_fail_percentage 12372 1727204082.04894: done checking for max_fail_percentage 12372 1727204082.04895: checking to see if all hosts have failed and the running result is not ok 12372 1727204082.04897: done checking to see if all hosts have failed 12372 1727204082.04898: getting the remaining hosts for this loop 12372 1727204082.04899: done getting the remaining hosts for this loop 12372 1727204082.04903: getting the next task for host managed-node3 12372 1727204082.04910: done getting next task for host managed-node3 12372 1727204082.04914: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12372 1727204082.04920: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204082.04938: getting variables 12372 1727204082.04940: in VariableManager get_vars() 12372 1727204082.04998: Calling all_inventory to load vars for managed-node3 12372 1727204082.05002: Calling groups_inventory to load vars for managed-node3 12372 1727204082.05005: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204082.05014: Calling all_plugins_play to load vars for managed-node3 12372 1727204082.05020: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204082.05024: Calling groups_plugins_play to load vars for managed-node3 12372 1727204082.05204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204082.05374: done with get_vars() 12372 1727204082.05383: done getting variables 12372 1727204082.05433: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.041) 0:00:09.040 ***** 12372 1727204082.05459: entering _queue_task() for managed-node3/package 12372 1727204082.05667: worker is 1 (out of 1 available) 12372 1727204082.05684: exiting _queue_task() for managed-node3/package 12372 1727204082.05699: done queuing things up, now waiting for results queue to drain 12372 1727204082.05701: waiting for pending results... 
12372 1727204082.05871: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12372 1727204082.05969: in run() - task 12b410aa-8751-244a-02f9-0000000000d8 12372 1727204082.05982: variable 'ansible_search_path' from source: unknown 12372 1727204082.05986: variable 'ansible_search_path' from source: unknown 12372 1727204082.06026: calling self._execute() 12372 1727204082.06106: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204082.06113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204082.06125: variable 'omit' from source: magic vars 12372 1727204082.06492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204082.08404: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204082.08459: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204082.08488: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204082.08521: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204082.08547: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204082.08613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204082.08640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204082.08662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204082.08700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204082.08713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204082.08823: variable 'ansible_distribution' from source: facts 12372 1727204082.08830: variable 'ansible_distribution_major_version' from source: facts 12372 1727204082.08840: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204082.08843: when evaluation is False, skipping this task 12372 1727204082.08845: _execute() done 12372 1727204082.08850: dumping result to json 12372 1727204082.08855: done dumping result, returning 12372 1727204082.08863: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-244a-02f9-0000000000d8] 12372 1727204082.08868: sending task result for task 12b410aa-8751-244a-02f9-0000000000d8 12372 1727204082.08964: done sending task result for task 
12b410aa-8751-244a-02f9-0000000000d8 12372 1727204082.08967: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204082.09029: no more pending results, returning what we have 12372 1727204082.09032: results queue empty 12372 1727204082.09033: checking for any_errors_fatal 12372 1727204082.09038: done checking for any_errors_fatal 12372 1727204082.09039: checking for max_fail_percentage 12372 1727204082.09041: done checking for max_fail_percentage 12372 1727204082.09042: checking to see if all hosts have failed and the running result is not ok 12372 1727204082.09044: done checking to see if all hosts have failed 12372 1727204082.09045: getting the remaining hosts for this loop 12372 1727204082.09046: done getting the remaining hosts for this loop 12372 1727204082.09050: getting the next task for host managed-node3 12372 1727204082.09055: done getting next task for host managed-node3 12372 1727204082.09059: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12372 1727204082.09062: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204082.09078: getting variables 12372 1727204082.09080: in VariableManager get_vars() 12372 1727204082.09129: Calling all_inventory to load vars for managed-node3 12372 1727204082.09132: Calling groups_inventory to load vars for managed-node3 12372 1727204082.09135: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204082.09143: Calling all_plugins_play to load vars for managed-node3 12372 1727204082.09145: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204082.09147: Calling groups_plugins_play to load vars for managed-node3 12372 1727204082.09279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204082.09444: done with get_vars() 12372 1727204082.09453: done getting variables 12372 1727204082.09499: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.040) 0:00:09.080 ***** 12372 1727204082.09524: entering _queue_task() for managed-node3/package 12372 1727204082.09711: worker is 1 (out of 1 available) 12372 1727204082.09728: exiting _queue_task() for managed-node3/package 12372 1727204082.09740: done queuing things up, now waiting for results queue to drain 12372 1727204082.09742: waiting for pending results... 
12372 1727204082.09920: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12372 1727204082.10012: in run() - task 12b410aa-8751-244a-02f9-0000000000d9 12372 1727204082.10025: variable 'ansible_search_path' from source: unknown 12372 1727204082.10029: variable 'ansible_search_path' from source: unknown 12372 1727204082.10059: calling self._execute() 12372 1727204082.10132: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204082.10139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204082.10149: variable 'omit' from source: magic vars 12372 1727204082.10755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204082.12523: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204082.12581: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204082.12613: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204082.12645: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204082.12667: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204082.12736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204082.12761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204082.12782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204082.12896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204082.12899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204082.12944: variable 'ansible_distribution' from source: facts 12372 1727204082.12949: variable 'ansible_distribution_major_version' from source: facts 12372 1727204082.12959: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204082.12963: when evaluation is False, skipping this task 12372 1727204082.12965: _execute() done 12372 1727204082.12970: dumping result to json 12372 1727204082.12975: done dumping result, returning 12372 1727204082.12982: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-244a-02f9-0000000000d9] 12372 1727204082.12988: sending task result for task 12b410aa-8751-244a-02f9-0000000000d9 12372 1727204082.13085: done sending task result for task 12b410aa-8751-244a-02f9-0000000000d9 12372 
1727204082.13088: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204082.13162: no more pending results, returning what we have 12372 1727204082.13166: results queue empty 12372 1727204082.13167: checking for any_errors_fatal 12372 1727204082.13174: done checking for any_errors_fatal 12372 1727204082.13175: checking for max_fail_percentage 12372 1727204082.13176: done checking for max_fail_percentage 12372 1727204082.13177: checking to see if all hosts have failed and the running result is not ok 12372 1727204082.13178: done checking to see if all hosts have failed 12372 1727204082.13179: getting the remaining hosts for this loop 12372 1727204082.13180: done getting the remaining hosts for this loop 12372 1727204082.13184: getting the next task for host managed-node3 12372 1727204082.13192: done getting next task for host managed-node3 12372 1727204082.13196: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12372 1727204082.13199: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204082.13215: getting variables 12372 1727204082.13217: in VariableManager get_vars() 12372 1727204082.13485: Calling all_inventory to load vars for managed-node3 12372 1727204082.13488: Calling groups_inventory to load vars for managed-node3 12372 1727204082.13497: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204082.13505: Calling all_plugins_play to load vars for managed-node3 12372 1727204082.13507: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204082.13512: Calling groups_plugins_play to load vars for managed-node3 12372 1727204082.13662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204082.13942: done with get_vars() 12372 1727204082.13953: done getting variables 12372 1727204082.14026: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.045) 0:00:09.126 ***** 12372 1727204082.14062: entering _queue_task() for managed-node3/service 12372 1727204082.14444: worker is 1 (out of 1 available) 12372 1727204082.14457: exiting _queue_task() for managed-node3/service 12372 1727204082.14468: done queuing things up, now waiting for results queue to drain 12372 1727204082.14470: waiting for pending results... 
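The "Restart NetworkManager due to wireless or team interfaces" task queued next resolves to the service action plugin; if its condition were true it would amount to a restart along the lines of the following sketch (a plain service restart, not the role's exact task):

- hosts: managed-node3
  become: true
  tasks:
    - name: Restart NetworkManager
      ansible.builtin.service:
        name: NetworkManager
        state: restarted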
12372 1727204082.14675: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12372 1727204082.14882: in run() - task 12b410aa-8751-244a-02f9-0000000000da 12372 1727204082.14886: variable 'ansible_search_path' from source: unknown 12372 1727204082.14890: variable 'ansible_search_path' from source: unknown 12372 1727204082.14929: calling self._execute() 12372 1727204082.15040: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204082.15099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204082.15103: variable 'omit' from source: magic vars 12372 1727204082.15630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204082.18550: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204082.18646: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204082.18698: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204082.18895: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204082.18899: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204082.18903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204082.19240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204082.19245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204082.19248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204082.19466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204082.19901: variable 'ansible_distribution' from source: facts 12372 1727204082.19915: variable 'ansible_distribution_major_version' from source: facts 12372 1727204082.19933: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204082.19974: when evaluation is False, skipping this task 12372 1727204082.19978: _execute() done 12372 1727204082.19981: dumping result to json 12372 1727204082.19984: done dumping result, returning 12372 1727204082.19986: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-244a-02f9-0000000000da] 12372 1727204082.20095: sending task result for task 12b410aa-8751-244a-02f9-0000000000da 12372 1727204082.20184: done sending task result for task 12b410aa-8751-244a-02f9-0000000000da 12372 
1727204082.20187: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204082.20252: no more pending results, returning what we have 12372 1727204082.20256: results queue empty 12372 1727204082.20258: checking for any_errors_fatal 12372 1727204082.20269: done checking for any_errors_fatal 12372 1727204082.20270: checking for max_fail_percentage 12372 1727204082.20272: done checking for max_fail_percentage 12372 1727204082.20273: checking to see if all hosts have failed and the running result is not ok 12372 1727204082.20274: done checking to see if all hosts have failed 12372 1727204082.20275: getting the remaining hosts for this loop 12372 1727204082.20277: done getting the remaining hosts for this loop 12372 1727204082.20282: getting the next task for host managed-node3 12372 1727204082.20293: done getting next task for host managed-node3 12372 1727204082.20298: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12372 1727204082.20304: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204082.20328: getting variables 12372 1727204082.20330: in VariableManager get_vars() 12372 1727204082.20636: Calling all_inventory to load vars for managed-node3 12372 1727204082.20640: Calling groups_inventory to load vars for managed-node3 12372 1727204082.20649: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204082.20660: Calling all_plugins_play to load vars for managed-node3 12372 1727204082.20664: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204082.20668: Calling groups_plugins_play to load vars for managed-node3 12372 1727204082.21014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204082.21377: done with get_vars() 12372 1727204082.21396: done getting variables 12372 1727204082.21470: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.074) 0:00:09.200 ***** 12372 1727204082.21522: entering _queue_task() for managed-node3/service 12372 1727204082.22005: worker is 1 (out of 1 available) 12372 1727204082.22019: exiting _queue_task() for managed-node3/service 12372 1727204082.22031: done queuing things up, now waiting for results queue to drain 12372 1727204082.22033: waiting for pending results... 
12372 1727204082.22393: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12372 1727204082.22417: in run() - task 12b410aa-8751-244a-02f9-0000000000db 12372 1727204082.22441: variable 'ansible_search_path' from source: unknown 12372 1727204082.22452: variable 'ansible_search_path' from source: unknown 12372 1727204082.22514: calling self._execute() 12372 1727204082.22658: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204082.22673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204082.22693: variable 'omit' from source: magic vars 12372 1727204082.23264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204082.26168: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204082.26276: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204082.26337: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204082.26413: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204082.26434: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204082.26542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204082.26582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204082.26630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204082.26895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204082.26898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204082.26901: variable 'ansible_distribution' from source: facts 12372 1727204082.26903: variable 'ansible_distribution_major_version' from source: facts 12372 1727204082.26906: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204082.26908: when evaluation is False, skipping this task 12372 1727204082.26910: _execute() done 12372 1727204082.26912: dumping result to json 12372 1727204082.26914: done dumping result, returning 12372 1727204082.26940: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-244a-02f9-0000000000db] 12372 1727204082.26952: sending task result for task 12b410aa-8751-244a-02f9-0000000000db skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 
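Unlike the earlier skips, this result prints a "censored" field instead of the false_condition: the "Enable and start NetworkManager" task carries no_log, so even its skip result is hidden. A minimal reproduction of that behaviour (generic task; only the use of no_log is confirmed by the log, the started/enabled arguments are an assumption):

- hosts: managed-node3
  become: true
  tasks:
    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: NetworkManager
        state: started
        enabled: true
      no_log: true          # any result, including a skip, is replaced by the "output has been hidden" placeholder

This is why the entry above shows only "censored" and "changed": false rather than the usual skip_reason details.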
12372 1727204082.27188: no more pending results, returning what we have 12372 1727204082.27194: results queue empty 12372 1727204082.27195: checking for any_errors_fatal 12372 1727204082.27204: done checking for any_errors_fatal 12372 1727204082.27205: checking for max_fail_percentage 12372 1727204082.27207: done checking for max_fail_percentage 12372 1727204082.27208: checking to see if all hosts have failed and the running result is not ok 12372 1727204082.27209: done checking to see if all hosts have failed 12372 1727204082.27210: getting the remaining hosts for this loop 12372 1727204082.27212: done getting the remaining hosts for this loop 12372 1727204082.27217: getting the next task for host managed-node3 12372 1727204082.27225: done getting next task for host managed-node3 12372 1727204082.27229: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12372 1727204082.27233: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204082.27255: done sending task result for task 12b410aa-8751-244a-02f9-0000000000db 12372 1727204082.27258: WORKER PROCESS EXITING 12372 1727204082.27371: getting variables 12372 1727204082.27374: in VariableManager get_vars() 12372 1727204082.27438: Calling all_inventory to load vars for managed-node3 12372 1727204082.27442: Calling groups_inventory to load vars for managed-node3 12372 1727204082.27445: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204082.27457: Calling all_plugins_play to load vars for managed-node3 12372 1727204082.27460: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204082.27583: Calling groups_plugins_play to load vars for managed-node3 12372 1727204082.27893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204082.28222: done with get_vars() 12372 1727204082.28245: done getting variables 12372 1727204082.28316: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.068) 0:00:09.269 ***** 12372 1727204082.28364: entering _queue_task() for managed-node3/service 12372 1727204082.28782: worker is 1 (out of 1 available) 12372 1727204082.28799: exiting _queue_task() for managed-node3/service 12372 1727204082.28810: done queuing things up, now waiting for results queue to drain 12372 1727204082.28812: waiting for pending results... 
12372 1727204082.29018: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12372 1727204082.29219: in run() - task 12b410aa-8751-244a-02f9-0000000000dc 12372 1727204082.29224: variable 'ansible_search_path' from source: unknown 12372 1727204082.29295: variable 'ansible_search_path' from source: unknown 12372 1727204082.29298: calling self._execute() 12372 1727204082.29385: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204082.29400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204082.29422: variable 'omit' from source: magic vars 12372 1727204082.29994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204082.32904: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204082.33028: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204082.33064: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204082.33115: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204082.33245: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204082.33276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204082.33319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204082.33368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204082.33428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204082.33460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204082.33648: variable 'ansible_distribution' from source: facts 12372 1727204082.33661: variable 'ansible_distribution_major_version' from source: facts 12372 1727204082.33687: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204082.33895: when evaluation is False, skipping this task 12372 1727204082.33898: _execute() done 12372 1727204082.33901: dumping result to json 12372 1727204082.33903: done dumping result, returning 12372 1727204082.33906: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-244a-02f9-0000000000dc] 12372 1727204082.33908: sending task result for task 12b410aa-8751-244a-02f9-0000000000dc 12372 1727204082.33994: done sending task result for task 12b410aa-8751-244a-02f9-0000000000dc 12372 1727204082.33998: WORKER PROCESS EXITING skipping: 
[managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204082.34059: no more pending results, returning what we have 12372 1727204082.34064: results queue empty 12372 1727204082.34065: checking for any_errors_fatal 12372 1727204082.34074: done checking for any_errors_fatal 12372 1727204082.34075: checking for max_fail_percentage 12372 1727204082.34078: done checking for max_fail_percentage 12372 1727204082.34080: checking to see if all hosts have failed and the running result is not ok 12372 1727204082.34082: done checking to see if all hosts have failed 12372 1727204082.34083: getting the remaining hosts for this loop 12372 1727204082.34084: done getting the remaining hosts for this loop 12372 1727204082.34091: getting the next task for host managed-node3 12372 1727204082.34099: done getting next task for host managed-node3 12372 1727204082.34104: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12372 1727204082.34110: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204082.34132: getting variables 12372 1727204082.34134: in VariableManager get_vars() 12372 1727204082.34314: Calling all_inventory to load vars for managed-node3 12372 1727204082.34317: Calling groups_inventory to load vars for managed-node3 12372 1727204082.34321: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204082.34331: Calling all_plugins_play to load vars for managed-node3 12372 1727204082.34335: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204082.34338: Calling groups_plugins_play to load vars for managed-node3 12372 1727204082.34716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204082.35004: done with get_vars() 12372 1727204082.35017: done getting variables 12372 1727204082.35085: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.067) 0:00:09.336 ***** 12372 1727204082.35127: entering _queue_task() for managed-node3/service 12372 1727204082.35466: worker is 1 (out of 1 available) 12372 1727204082.35482: exiting _queue_task() for managed-node3/service 12372 1727204082.35560: done queuing things up, now waiting for results queue to drain 12372 1727204082.35563: waiting for pending results... 
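The NetworkManager and wpa_supplicant service tasks were both skipped because the distribution facts gathered from managed-node3 do not satisfy the condition. To confirm what a managed host actually reports, a throwaway debug task (illustrative only, not part of the role) can print the two facts the condition reads:

    - name: Show facts referenced by the skip condition (ad-hoc check, not part of the role)
      ansible.builtin.debug:
        msg: "distribution={{ ansible_distribution }} major={{ ansible_distribution_major_version }}"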
12372 1727204082.35823: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 12372 1727204082.35995: in run() - task 12b410aa-8751-244a-02f9-0000000000dd 12372 1727204082.35999: variable 'ansible_search_path' from source: unknown 12372 1727204082.36002: variable 'ansible_search_path' from source: unknown 12372 1727204082.36004: calling self._execute() 12372 1727204082.36081: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204082.36098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204082.36115: variable 'omit' from source: magic vars 12372 1727204082.36657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204082.39422: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204082.39527: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204082.39581: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204082.39632: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204082.39672: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204082.39775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204082.39820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204082.39973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204082.39977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204082.39980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204082.40109: variable 'ansible_distribution' from source: facts 12372 1727204082.40125: variable 'ansible_distribution_major_version' from source: facts 12372 1727204082.40140: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204082.40148: when evaluation is False, skipping this task 12372 1727204082.40156: _execute() done 12372 1727204082.40163: dumping result to json 12372 1727204082.40172: done dumping result, returning 12372 1727204082.40191: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-244a-02f9-0000000000dd] 12372 1727204082.40204: sending task result for task 12b410aa-8751-244a-02f9-0000000000dd 12372 1727204082.40499: done sending task result for task 12b410aa-8751-244a-02f9-0000000000dd 12372 1727204082.40503: WORKER PROCESS EXITING skipping: [managed-node3] => { 
"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12372 1727204082.40551: no more pending results, returning what we have 12372 1727204082.40554: results queue empty 12372 1727204082.40556: checking for any_errors_fatal 12372 1727204082.40563: done checking for any_errors_fatal 12372 1727204082.40564: checking for max_fail_percentage 12372 1727204082.40566: done checking for max_fail_percentage 12372 1727204082.40567: checking to see if all hosts have failed and the running result is not ok 12372 1727204082.40568: done checking to see if all hosts have failed 12372 1727204082.40569: getting the remaining hosts for this loop 12372 1727204082.40571: done getting the remaining hosts for this loop 12372 1727204082.40575: getting the next task for host managed-node3 12372 1727204082.40582: done getting next task for host managed-node3 12372 1727204082.40586: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12372 1727204082.40592: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204082.40614: getting variables 12372 1727204082.40618: in VariableManager get_vars() 12372 1727204082.40678: Calling all_inventory to load vars for managed-node3 12372 1727204082.40681: Calling groups_inventory to load vars for managed-node3 12372 1727204082.40684: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204082.40832: Calling all_plugins_play to load vars for managed-node3 12372 1727204082.40836: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204082.40841: Calling groups_plugins_play to load vars for managed-node3 12372 1727204082.41071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204082.41348: done with get_vars() 12372 1727204082.41360: done getting variables 12372 1727204082.41428: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.063) 0:00:09.400 ***** 12372 1727204082.41469: entering _queue_task() for managed-node3/copy 12372 1727204082.41778: worker is 1 (out of 1 available) 12372 1727204082.41997: exiting _queue_task() for managed-node3/copy 12372 1727204082.42009: done queuing things up, now waiting for results queue to drain 12372 1727204082.42011: waiting for pending results... 
12372 1727204082.42129: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12372 1727204082.42282: in run() - task 12b410aa-8751-244a-02f9-0000000000de 12372 1727204082.42306: variable 'ansible_search_path' from source: unknown 12372 1727204082.42318: variable 'ansible_search_path' from source: unknown 12372 1727204082.42463: calling self._execute() 12372 1727204082.42484: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204082.42501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204082.42523: variable 'omit' from source: magic vars 12372 1727204082.43088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204082.45879: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204082.45973: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204082.46026: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204082.46077: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204082.46114: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204082.46223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204082.46268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204082.46307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204082.46366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204082.46396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204082.46594: variable 'ansible_distribution' from source: facts 12372 1727204082.46598: variable 'ansible_distribution_major_version' from source: facts 12372 1727204082.46601: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204082.46610: when evaluation is False, skipping this task 12372 1727204082.46619: _execute() done 12372 1727204082.46694: dumping result to json 12372 1727204082.46697: done dumping result, returning 12372 1727204082.46701: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-244a-02f9-0000000000de] 12372 1727204082.46704: sending task result for task 12b410aa-8751-244a-02f9-0000000000de 12372 1727204082.46786: done sending task result for task 12b410aa-8751-244a-02f9-0000000000de 12372 1727204082.46792: 
WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204082.46848: no more pending results, returning what we have 12372 1727204082.46852: results queue empty 12372 1727204082.46853: checking for any_errors_fatal 12372 1727204082.46862: done checking for any_errors_fatal 12372 1727204082.46863: checking for max_fail_percentage 12372 1727204082.46865: done checking for max_fail_percentage 12372 1727204082.46866: checking to see if all hosts have failed and the running result is not ok 12372 1727204082.46867: done checking to see if all hosts have failed 12372 1727204082.46868: getting the remaining hosts for this loop 12372 1727204082.46870: done getting the remaining hosts for this loop 12372 1727204082.46875: getting the next task for host managed-node3 12372 1727204082.46885: done getting next task for host managed-node3 12372 1727204082.46891: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12372 1727204082.46895: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204082.46919: getting variables 12372 1727204082.46921: in VariableManager get_vars() 12372 1727204082.46983: Calling all_inventory to load vars for managed-node3 12372 1727204082.46987: Calling groups_inventory to load vars for managed-node3 12372 1727204082.47196: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204082.47208: Calling all_plugins_play to load vars for managed-node3 12372 1727204082.47212: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204082.47218: Calling groups_plugins_play to load vars for managed-node3 12372 1727204082.47636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204082.47943: done with get_vars() 12372 1727204082.47957: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.065) 0:00:09.466 ***** 12372 1727204082.48059: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 12372 1727204082.48355: worker is 1 (out of 1 available) 12372 1727204082.48369: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 12372 1727204082.48383: done queuing things up, now waiting for results queue to drain 12372 1727204082.48384: waiting for pending results... 
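tasks/main.yml:159 dispatches to the role's own network_connections action plugin. A minimal sketch, assuming the module is fed provider and connection-profile variables; the parameter and variable names are assumptions, and only the task name, the plugin, and the guard come from the log:

    - name: Configure networking connection profiles
      fedora.linux_system_roles.network_connections:
        provider: "{{ network_provider }}"         # parameter and variable names are assumptions
        connections: "{{ network_connections }}"   # assumption
      when: ansible_distribution in ['CentOS', 'RedHat'] and ansible_distribution_major_version | int < 9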
12372 1727204082.48710: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12372 1727204082.48919: in run() - task 12b410aa-8751-244a-02f9-0000000000df 12372 1727204082.48923: variable 'ansible_search_path' from source: unknown 12372 1727204082.48926: variable 'ansible_search_path' from source: unknown 12372 1727204082.48929: calling self._execute() 12372 1727204082.49008: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204082.49028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204082.49046: variable 'omit' from source: magic vars 12372 1727204082.49584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204082.52743: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204082.52846: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204082.52902: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204082.52953: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204082.53079: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204082.53098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204082.53142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204082.53178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204082.53244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204082.53267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204082.53442: variable 'ansible_distribution' from source: facts 12372 1727204082.53454: variable 'ansible_distribution_major_version' from source: facts 12372 1727204082.53470: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204082.53479: when evaluation is False, skipping this task 12372 1727204082.53486: _execute() done 12372 1727204082.53495: dumping result to json 12372 1727204082.53504: done dumping result, returning 12372 1727204082.53524: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-244a-02f9-0000000000df] 12372 1727204082.53534: sending task result for task 12b410aa-8751-244a-02f9-0000000000df 12372 1727204082.53895: done sending task result for task 12b410aa-8751-244a-02f9-0000000000df 12372 1727204082.53899: WORKER PROCESS EXITING 
skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204082.53949: no more pending results, returning what we have 12372 1727204082.53953: results queue empty 12372 1727204082.53954: checking for any_errors_fatal 12372 1727204082.53960: done checking for any_errors_fatal 12372 1727204082.53961: checking for max_fail_percentage 12372 1727204082.53963: done checking for max_fail_percentage 12372 1727204082.53964: checking to see if all hosts have failed and the running result is not ok 12372 1727204082.53965: done checking to see if all hosts have failed 12372 1727204082.53966: getting the remaining hosts for this loop 12372 1727204082.53967: done getting the remaining hosts for this loop 12372 1727204082.53971: getting the next task for host managed-node3 12372 1727204082.53978: done getting next task for host managed-node3 12372 1727204082.53982: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12372 1727204082.53986: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204082.54006: getting variables 12372 1727204082.54008: in VariableManager get_vars() 12372 1727204082.54064: Calling all_inventory to load vars for managed-node3 12372 1727204082.54068: Calling groups_inventory to load vars for managed-node3 12372 1727204082.54071: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204082.54081: Calling all_plugins_play to load vars for managed-node3 12372 1727204082.54084: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204082.54088: Calling groups_plugins_play to load vars for managed-node3 12372 1727204082.54420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204082.54730: done with get_vars() 12372 1727204082.54743: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.067) 0:00:09.534 ***** 12372 1727204082.54843: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 12372 1727204082.55114: worker is 1 (out of 1 available) 12372 1727204082.55129: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 12372 1727204082.55142: done queuing things up, now waiting for results queue to drain 12372 1727204082.55144: waiting for pending results... 
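The companion task at tasks/main.yml:171 uses the network_state action plugin instead. One plausible shape, with the parameter and variable names again assumed rather than taken from the role source:

    - name: Configure networking state
      fedora.linux_system_roles.network_state:
        state: "{{ network_state }}"   # parameter and variable names are assumptions
      # skipped under the same distribution condition shown in the log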
12372 1727204082.55443: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 12372 1727204082.55602: in run() - task 12b410aa-8751-244a-02f9-0000000000e0 12372 1727204082.55630: variable 'ansible_search_path' from source: unknown 12372 1727204082.55638: variable 'ansible_search_path' from source: unknown 12372 1727204082.55682: calling self._execute() 12372 1727204082.55784: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204082.55800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204082.55818: variable 'omit' from source: magic vars 12372 1727204082.56440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204082.59238: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204082.59279: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204082.59331: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204082.59402: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204082.59453: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204082.59566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204082.59612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204082.59712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204082.59855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204082.59858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204082.60197: variable 'ansible_distribution' from source: facts 12372 1727204082.60205: variable 'ansible_distribution_major_version' from source: facts 12372 1727204082.60224: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204082.60227: when evaluation is False, skipping this task 12372 1727204082.60230: _execute() done 12372 1727204082.60236: dumping result to json 12372 1727204082.60240: done dumping result, returning 12372 1727204082.60252: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-244a-02f9-0000000000e0] 12372 1727204082.60268: sending task result for task 12b410aa-8751-244a-02f9-0000000000e0 12372 1727204082.60364: done sending task result for task 12b410aa-8751-244a-02f9-0000000000e0 12372 1727204082.60368: WORKER PROCESS EXITING skipping: [managed-node3] => { 
"changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204082.60428: no more pending results, returning what we have 12372 1727204082.60432: results queue empty 12372 1727204082.60433: checking for any_errors_fatal 12372 1727204082.60440: done checking for any_errors_fatal 12372 1727204082.60442: checking for max_fail_percentage 12372 1727204082.60443: done checking for max_fail_percentage 12372 1727204082.60444: checking to see if all hosts have failed and the running result is not ok 12372 1727204082.60445: done checking to see if all hosts have failed 12372 1727204082.60446: getting the remaining hosts for this loop 12372 1727204082.60448: done getting the remaining hosts for this loop 12372 1727204082.60452: getting the next task for host managed-node3 12372 1727204082.60458: done getting next task for host managed-node3 12372 1727204082.60462: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12372 1727204082.60465: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204082.60600: getting variables 12372 1727204082.60602: in VariableManager get_vars() 12372 1727204082.60658: Calling all_inventory to load vars for managed-node3 12372 1727204082.60662: Calling groups_inventory to load vars for managed-node3 12372 1727204082.60665: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204082.60675: Calling all_plugins_play to load vars for managed-node3 12372 1727204082.60678: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204082.60682: Calling groups_plugins_play to load vars for managed-node3 12372 1727204082.60981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204082.61295: done with get_vars() 12372 1727204082.61309: done getting variables 12372 1727204082.61381: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.065) 0:00:09.599 ***** 12372 1727204082.61421: entering _queue_task() for managed-node3/debug 12372 1727204082.61908: worker is 1 (out of 1 available) 12372 1727204082.61920: exiting _queue_task() for managed-node3/debug 12372 1727204082.61929: done queuing things up, now waiting for results queue to drain 12372 1727204082.61931: waiting for pending results... 
12372 1727204082.62212: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12372 1727204082.62217: in run() - task 12b410aa-8751-244a-02f9-0000000000e1 12372 1727204082.62221: variable 'ansible_search_path' from source: unknown 12372 1727204082.62223: variable 'ansible_search_path' from source: unknown 12372 1727204082.62245: calling self._execute() 12372 1727204082.62355: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204082.62370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204082.62388: variable 'omit' from source: magic vars 12372 1727204082.62935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204082.65798: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204082.65802: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204082.65818: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204082.65859: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204082.65897: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204082.66005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204082.66056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204082.66097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204082.66162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204082.66186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204082.66367: variable 'ansible_distribution' from source: facts 12372 1727204082.66380: variable 'ansible_distribution_major_version' from source: facts 12372 1727204082.66400: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204082.66431: when evaluation is False, skipping this task 12372 1727204082.66439: _execute() done 12372 1727204082.66453: dumping result to json 12372 1727204082.66464: done dumping result, returning 12372 1727204082.66478: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-244a-02f9-0000000000e1] 12372 1727204082.66564: sending task result for task 12b410aa-8751-244a-02f9-0000000000e1 12372 1727204082.66646: done sending task result for task 12b410aa-8751-244a-02f9-0000000000e1 12372 1727204082.66650: WORKER 
PROCESS EXITING skipping: [managed-node3] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12372 1727204082.66730: no more pending results, returning what we have 12372 1727204082.66734: results queue empty 12372 1727204082.66735: checking for any_errors_fatal 12372 1727204082.66744: done checking for any_errors_fatal 12372 1727204082.66745: checking for max_fail_percentage 12372 1727204082.66747: done checking for max_fail_percentage 12372 1727204082.66749: checking to see if all hosts have failed and the running result is not ok 12372 1727204082.66750: done checking to see if all hosts have failed 12372 1727204082.66751: getting the remaining hosts for this loop 12372 1727204082.66753: done getting the remaining hosts for this loop 12372 1727204082.66758: getting the next task for host managed-node3 12372 1727204082.66766: done getting next task for host managed-node3 12372 1727204082.66771: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12372 1727204082.66786: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204082.66812: getting variables 12372 1727204082.66814: in VariableManager get_vars() 12372 1727204082.66877: Calling all_inventory to load vars for managed-node3 12372 1727204082.66881: Calling groups_inventory to load vars for managed-node3 12372 1727204082.67018: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204082.67030: Calling all_plugins_play to load vars for managed-node3 12372 1727204082.67033: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204082.67038: Calling groups_plugins_play to load vars for managed-node3 12372 1727204082.67463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204082.67775: done with get_vars() 12372 1727204082.67788: done getting variables 12372 1727204082.67855: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.064) 0:00:09.664 ***** 12372 1727204082.67903: entering _queue_task() for managed-node3/debug 12372 1727204082.68176: worker is 1 (out of 1 available) 12372 1727204082.68315: exiting _queue_task() for managed-node3/debug 12372 1727204082.68326: done queuing things up, now waiting for results queue to drain 12372 1727204082.68329: waiting for pending results... 
12372 1727204082.68606: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12372 1727204082.68687: in run() - task 12b410aa-8751-244a-02f9-0000000000e2 12372 1727204082.68712: variable 'ansible_search_path' from source: unknown 12372 1727204082.68755: variable 'ansible_search_path' from source: unknown 12372 1727204082.68779: calling self._execute() 12372 1727204082.68886: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204082.68901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204082.68918: variable 'omit' from source: magic vars 12372 1727204082.69625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204082.72410: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204082.72499: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204082.72564: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204082.72612: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204082.72653: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204082.72758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204082.72801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204082.72844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204082.72902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204082.72925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204082.73163: variable 'ansible_distribution' from source: facts 12372 1727204082.73167: variable 'ansible_distribution_major_version' from source: facts 12372 1727204082.73169: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204082.73172: when evaluation is False, skipping this task 12372 1727204082.73174: _execute() done 12372 1727204082.73176: dumping result to json 12372 1727204082.73178: done dumping result, returning 12372 1727204082.73181: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-244a-02f9-0000000000e2] 12372 1727204082.73183: sending task result for task 12b410aa-8751-244a-02f9-0000000000e2 12372 1727204082.73463: done sending task result for task 12b410aa-8751-244a-02f9-0000000000e2 12372 1727204082.73466: WORKER 
PROCESS EXITING skipping: [managed-node3] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12372 1727204082.73523: no more pending results, returning what we have 12372 1727204082.73527: results queue empty 12372 1727204082.73528: checking for any_errors_fatal 12372 1727204082.73534: done checking for any_errors_fatal 12372 1727204082.73536: checking for max_fail_percentage 12372 1727204082.73538: done checking for max_fail_percentage 12372 1727204082.73539: checking to see if all hosts have failed and the running result is not ok 12372 1727204082.73540: done checking to see if all hosts have failed 12372 1727204082.73541: getting the remaining hosts for this loop 12372 1727204082.73543: done getting the remaining hosts for this loop 12372 1727204082.73550: getting the next task for host managed-node3 12372 1727204082.73557: done getting next task for host managed-node3 12372 1727204082.73562: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12372 1727204082.73566: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204082.73588: getting variables 12372 1727204082.73592: in VariableManager get_vars() 12372 1727204082.73653: Calling all_inventory to load vars for managed-node3 12372 1727204082.73658: Calling groups_inventory to load vars for managed-node3 12372 1727204082.73661: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204082.73673: Calling all_plugins_play to load vars for managed-node3 12372 1727204082.73676: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204082.73680: Calling groups_plugins_play to load vars for managed-node3 12372 1727204082.74085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204082.74398: done with get_vars() 12372 1727204082.74411: done getting variables 12372 1727204082.74481: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.066) 0:00:09.730 ***** 12372 1727204082.74524: entering _queue_task() for managed-node3/debug 12372 1727204082.74906: worker is 1 (out of 1 available) 12372 1727204082.74918: exiting _queue_task() for managed-node3/debug 12372 1727204082.74929: done queuing things up, now waiting for results queue to drain 12372 1727204082.74931: waiting for pending results... 
12372 1727204082.75167: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12372 1727204082.75285: in run() - task 12b410aa-8751-244a-02f9-0000000000e3 12372 1727204082.75309: variable 'ansible_search_path' from source: unknown 12372 1727204082.75332: variable 'ansible_search_path' from source: unknown 12372 1727204082.75364: calling self._execute() 12372 1727204082.75482: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204082.75486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204082.75500: variable 'omit' from source: magic vars 12372 1727204082.76095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204082.78763: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204082.78858: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204082.78922: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204082.78977: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204082.79015: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204082.79123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204082.79165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204082.79221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204082.79292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204082.79310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204082.79507: variable 'ansible_distribution' from source: facts 12372 1727204082.79510: variable 'ansible_distribution_major_version' from source: facts 12372 1727204082.79512: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204082.79519: when evaluation is False, skipping this task 12372 1727204082.79525: _execute() done 12372 1727204082.79534: dumping result to json 12372 1727204082.79542: done dumping result, returning 12372 1727204082.79558: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-244a-02f9-0000000000e3] 12372 1727204082.79615: sending task result for task 12b410aa-8751-244a-02f9-0000000000e3 12372 1727204082.79704: done sending task result for task 12b410aa-8751-244a-02f9-0000000000e3 12372 1727204082.79708: WORKER PROCESS EXITING 
skipping: [managed-node3] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12372 1727204082.79774: no more pending results, returning what we have 12372 1727204082.79778: results queue empty 12372 1727204082.79779: checking for any_errors_fatal 12372 1727204082.79786: done checking for any_errors_fatal 12372 1727204082.79787: checking for max_fail_percentage 12372 1727204082.79790: done checking for max_fail_percentage 12372 1727204082.79792: checking to see if all hosts have failed and the running result is not ok 12372 1727204082.79793: done checking to see if all hosts have failed 12372 1727204082.79794: getting the remaining hosts for this loop 12372 1727204082.79795: done getting the remaining hosts for this loop 12372 1727204082.79800: getting the next task for host managed-node3 12372 1727204082.79809: done getting next task for host managed-node3 12372 1727204082.79813: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12372 1727204082.79817: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204082.79838: getting variables 12372 1727204082.79840: in VariableManager get_vars() 12372 1727204082.80016: Calling all_inventory to load vars for managed-node3 12372 1727204082.80020: Calling groups_inventory to load vars for managed-node3 12372 1727204082.80023: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204082.80035: Calling all_plugins_play to load vars for managed-node3 12372 1727204082.80038: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204082.80041: Calling groups_plugins_play to load vars for managed-node3 12372 1727204082.80480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204082.80818: done with get_vars() 12372 1727204082.80830: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.064) 0:00:09.795 ***** 12372 1727204082.80939: entering _queue_task() for managed-node3/ping 12372 1727204082.81313: worker is 1 (out of 1 available) 12372 1727204082.81325: exiting _queue_task() for managed-node3/ping 12372 1727204082.81336: done queuing things up, now waiting for results queue to drain 12372 1727204082.81337: waiting for pending results... 
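tasks/main.yml:192 queues a ping task to re-test connectivity after the (skipped) reconfiguration. The ping module and the guard are confirmed by the log; everything else below is a minimal sketch:

    - name: Re-test connectivity
      ansible.builtin.ping:
      when:
        - ansible_distribution in ['CentOS', 'RedHat']
        - ansible_distribution_major_version | int < 9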
12372 1727204082.81647: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 12372 1727204082.81698: in run() - task 12b410aa-8751-244a-02f9-0000000000e4 12372 1727204082.81720: variable 'ansible_search_path' from source: unknown 12372 1727204082.81730: variable 'ansible_search_path' from source: unknown 12372 1727204082.81782: calling self._execute() 12372 1727204082.81886: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204082.81907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204082.81960: variable 'omit' from source: magic vars 12372 1727204082.82599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204082.85696: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204082.85835: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204082.85843: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204082.85894: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204082.85931: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204082.86033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204082.86082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204082.86119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204082.86182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204082.86272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204082.86372: variable 'ansible_distribution' from source: facts 12372 1727204082.86393: variable 'ansible_distribution_major_version' from source: facts 12372 1727204082.86412: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204082.86491: when evaluation is False, skipping this task 12372 1727204082.86495: _execute() done 12372 1727204082.86498: dumping result to json 12372 1727204082.86501: done dumping result, returning 12372 1727204082.86503: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-244a-02f9-0000000000e4] 12372 1727204082.86505: sending task result for task 12b410aa-8751-244a-02f9-0000000000e4 12372 1727204082.86575: done sending task result for task 12b410aa-8751-244a-02f9-0000000000e4 12372 1727204082.86578: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": 
false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204082.86633: no more pending results, returning what we have 12372 1727204082.86637: results queue empty 12372 1727204082.86639: checking for any_errors_fatal 12372 1727204082.86649: done checking for any_errors_fatal 12372 1727204082.86650: checking for max_fail_percentage 12372 1727204082.86652: done checking for max_fail_percentage 12372 1727204082.86653: checking to see if all hosts have failed and the running result is not ok 12372 1727204082.86654: done checking to see if all hosts have failed 12372 1727204082.86655: getting the remaining hosts for this loop 12372 1727204082.86657: done getting the remaining hosts for this loop 12372 1727204082.86662: getting the next task for host managed-node3 12372 1727204082.86673: done getting next task for host managed-node3 12372 1727204082.86676: ^ task is: TASK: meta (role_complete) 12372 1727204082.86680: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204082.86703: getting variables 12372 1727204082.86705: in VariableManager get_vars() 12372 1727204082.86769: Calling all_inventory to load vars for managed-node3 12372 1727204082.86773: Calling groups_inventory to load vars for managed-node3 12372 1727204082.86776: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204082.86787: Calling all_plugins_play to load vars for managed-node3 12372 1727204082.86995: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204082.87001: Calling groups_plugins_play to load vars for managed-node3 12372 1727204082.87308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204082.87620: done with get_vars() 12372 1727204082.87634: done getting variables 12372 1727204082.87734: done queuing things up, now waiting for results queue to drain 12372 1727204082.87737: results queue empty 12372 1727204082.87738: checking for any_errors_fatal 12372 1727204082.87741: done checking for any_errors_fatal 12372 1727204082.87742: checking for max_fail_percentage 12372 1727204082.87743: done checking for max_fail_percentage 12372 1727204082.87744: checking to see if all hosts have failed and the running result is not ok 12372 1727204082.87745: done checking to see if all hosts have failed 12372 1727204082.87746: getting the remaining hosts for this loop 12372 1727204082.87747: done getting the remaining hosts for this loop 12372 1727204082.87750: getting the next task for host managed-node3 12372 1727204082.87756: done getting next task for host managed-node3 12372 1727204082.87759: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12372 1727204082.87762: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204082.87778: getting variables 12372 1727204082.87779: in VariableManager get_vars() 12372 1727204082.87808: Calling all_inventory to load vars for managed-node3 12372 1727204082.87811: Calling groups_inventory to load vars for managed-node3 12372 1727204082.87814: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204082.87824: Calling all_plugins_play to load vars for managed-node3 12372 1727204082.87827: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204082.87831: Calling groups_plugins_play to load vars for managed-node3 12372 1727204082.88036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204082.88372: done with get_vars() 12372 1727204082.88382: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.075) 0:00:09.870 ***** 12372 1727204082.88471: entering _queue_task() for managed-node3/include_tasks 12372 1727204082.88746: worker is 1 (out of 1 available) 12372 1727204082.88874: exiting _queue_task() for managed-node3/include_tasks 12372 1727204082.88887: done queuing things up, now waiting for results queue to drain 12372 1727204082.88891: waiting for pending results... 
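A note on the skip pattern that repeats through the rest of this log: every role task reports the same false_condition, the distribution gate "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)". That is characteristic of a when: guard placed on the block or include that pulls the role in, because Ansible re-evaluates an inherited conditional for each task inside the role. The calling playbook is not part of this excerpt, so the following is only a minimal sketch of that pattern, reusing the conditional and role name quoted in the log; the play layout and task name are assumptions.

    ---
    # Illustrative only: a role included under a when: guard. The guard is
    # re-evaluated for every task inside the role, which is why every task
    # below reports the same false_condition and is skipped on these hosts.
    - hosts: managed-node3
      gather_facts: true          # ansible_distribution* come from the setup module
      tasks:
        - name: Configure networking only on CentOS/RHEL hosts older than 9   # name assumed
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.network
          when: >-
            ansible_distribution in ['CentOS', 'RedHat'] and
            ansible_distribution_major_version | int < 9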
12372 1727204082.89210: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12372 1727204082.89242: in run() - task 12b410aa-8751-244a-02f9-00000000011b 12372 1727204082.89309: variable 'ansible_search_path' from source: unknown 12372 1727204082.89313: variable 'ansible_search_path' from source: unknown 12372 1727204082.89318: calling self._execute() 12372 1727204082.89420: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204082.89434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204082.89449: variable 'omit' from source: magic vars 12372 1727204082.89976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204082.92645: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204082.92800: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204082.92803: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204082.92851: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204082.92897: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204082.93006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204082.93057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204082.93098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204082.93172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204082.93239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204082.93379: variable 'ansible_distribution' from source: facts 12372 1727204082.93396: variable 'ansible_distribution_major_version' from source: facts 12372 1727204082.93415: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204082.93424: when evaluation is False, skipping this task 12372 1727204082.93432: _execute() done 12372 1727204082.93453: dumping result to json 12372 1727204082.93457: done dumping result, returning 12372 1727204082.93563: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-244a-02f9-00000000011b] 12372 1727204082.93567: sending task result for task 12b410aa-8751-244a-02f9-00000000011b 12372 1727204082.93645: done sending task result for task 12b410aa-8751-244a-02f9-00000000011b 12372 1727204082.93649: WORKER PROCESS EXITING skipping: 
[managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204082.93724: no more pending results, returning what we have 12372 1727204082.93729: results queue empty 12372 1727204082.93730: checking for any_errors_fatal 12372 1727204082.93732: done checking for any_errors_fatal 12372 1727204082.93733: checking for max_fail_percentage 12372 1727204082.93735: done checking for max_fail_percentage 12372 1727204082.93736: checking to see if all hosts have failed and the running result is not ok 12372 1727204082.93737: done checking to see if all hosts have failed 12372 1727204082.93738: getting the remaining hosts for this loop 12372 1727204082.93740: done getting the remaining hosts for this loop 12372 1727204082.93745: getting the next task for host managed-node3 12372 1727204082.93753: done getting next task for host managed-node3 12372 1727204082.93758: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12372 1727204082.93761: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204082.94003: getting variables 12372 1727204082.94005: in VariableManager get_vars() 12372 1727204082.94061: Calling all_inventory to load vars for managed-node3 12372 1727204082.94064: Calling groups_inventory to load vars for managed-node3 12372 1727204082.94067: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204082.94077: Calling all_plugins_play to load vars for managed-node3 12372 1727204082.94081: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204082.94085: Calling groups_plugins_play to load vars for managed-node3 12372 1727204082.94341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204082.94652: done with get_vars() 12372 1727204082.94665: done getting variables 12372 1727204082.94728: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.062) 0:00:09.933 ***** 12372 1727204082.94769: entering _queue_task() for managed-node3/debug 12372 1727204082.95026: worker is 1 (out of 1 available) 12372 1727204082.95041: exiting _queue_task() for managed-node3/debug 12372 1727204082.95053: done queuing things up, now waiting for results queue to drain 12372 1727204082.95055: waiting for pending results... 
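The task banners carry the source location of each task (roles/network/tasks/main.yml:4, :7, :11, and so on), so these checks all sit at the top of the role's tasks/main.yml. The sketch below shows how such a file is typically laid out; only the task names and module kinds (include_tasks at :4, debug at :7) come from the log, while the task bodies and the included file name are assumptions, not the role's actual source.

    # roles/network/tasks/main.yml -- illustrative skeleton only
    - name: Ensure ansible_facts used by role
      ansible.builtin.include_tasks: set_facts.yml      # included file name assumed

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider | default('undefined') }}"   # body assumed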
12372 1727204082.95422: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 12372 1727204082.95519: in run() - task 12b410aa-8751-244a-02f9-00000000011c 12372 1727204082.95541: variable 'ansible_search_path' from source: unknown 12372 1727204082.95551: variable 'ansible_search_path' from source: unknown 12372 1727204082.95598: calling self._execute() 12372 1727204082.95700: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204082.95716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204082.95740: variable 'omit' from source: magic vars 12372 1727204082.96390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204082.99121: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204082.99207: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204082.99262: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204082.99326: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204082.99357: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204082.99495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204082.99502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204082.99543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204082.99605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204082.99627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204082.99806: variable 'ansible_distribution' from source: facts 12372 1727204082.99867: variable 'ansible_distribution_major_version' from source: facts 12372 1727204082.99871: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204082.99874: when evaluation is False, skipping this task 12372 1727204082.99877: _execute() done 12372 1727204082.99879: dumping result to json 12372 1727204082.99881: done dumping result, returning 12372 1727204082.99884: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-244a-02f9-00000000011c] 12372 1727204082.99886: sending task result for task 12b410aa-8751-244a-02f9-00000000011c 12372 1727204083.00077: done sending task result for task 12b410aa-8751-244a-02f9-00000000011c 12372 1727204083.00081: WORKER PROCESS EXITING skipping: [managed-node3] => { 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12372 1727204083.00160: no more pending results, returning what we have 12372 1727204083.00164: results queue empty 12372 1727204083.00165: checking for any_errors_fatal 12372 1727204083.00171: done checking for any_errors_fatal 12372 1727204083.00172: checking for max_fail_percentage 12372 1727204083.00175: done checking for max_fail_percentage 12372 1727204083.00176: checking to see if all hosts have failed and the running result is not ok 12372 1727204083.00177: done checking to see if all hosts have failed 12372 1727204083.00178: getting the remaining hosts for this loop 12372 1727204083.00180: done getting the remaining hosts for this loop 12372 1727204083.00185: getting the next task for host managed-node3 12372 1727204083.00194: done getting next task for host managed-node3 12372 1727204083.00199: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12372 1727204083.00202: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204083.00224: getting variables 12372 1727204083.00226: in VariableManager get_vars() 12372 1727204083.00288: Calling all_inventory to load vars for managed-node3 12372 1727204083.00478: Calling groups_inventory to load vars for managed-node3 12372 1727204083.00482: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204083.00495: Calling all_plugins_play to load vars for managed-node3 12372 1727204083.00498: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204083.00507: Calling groups_plugins_play to load vars for managed-node3 12372 1727204083.00798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204083.01112: done with get_vars() 12372 1727204083.01125: done getting variables 12372 1727204083.01198: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:54:43 -0400 (0:00:00.064) 0:00:09.998 ***** 12372 1727204083.01235: entering _queue_task() for managed-node3/fail 12372 1727204083.01627: worker is 1 (out of 1 available) 12372 1727204083.01640: exiting _queue_task() for managed-node3/fail 12372 1727204083.01653: done queuing things up, now waiting for results queue to drain 12372 1727204083.01655: waiting for pending results... 
12372 1727204083.01931: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12372 1727204083.02136: in run() - task 12b410aa-8751-244a-02f9-00000000011d 12372 1727204083.02141: variable 'ansible_search_path' from source: unknown 12372 1727204083.02144: variable 'ansible_search_path' from source: unknown 12372 1727204083.02146: calling self._execute() 12372 1727204083.02196: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204083.02210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204083.02226: variable 'omit' from source: magic vars 12372 1727204083.02762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204083.05746: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204083.05845: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204083.05887: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204083.05938: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204083.05970: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204083.06073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204083.06109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204083.06148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204083.06199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204083.06277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204083.06384: variable 'ansible_distribution' from source: facts 12372 1727204083.06393: variable 'ansible_distribution_major_version' from source: facts 12372 1727204083.06406: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204083.06410: when evaluation is False, skipping this task 12372 1727204083.06413: _execute() done 12372 1727204083.06418: dumping result to json 12372 1727204083.06426: done dumping result, returning 12372 1727204083.06437: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-244a-02f9-00000000011d] 12372 1727204083.06443: sending task result for task 
12b410aa-8751-244a-02f9-00000000011d 12372 1727204083.06554: done sending task result for task 12b410aa-8751-244a-02f9-00000000011d 12372 1727204083.06557: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204083.06611: no more pending results, returning what we have 12372 1727204083.06616: results queue empty 12372 1727204083.06617: checking for any_errors_fatal 12372 1727204083.06623: done checking for any_errors_fatal 12372 1727204083.06624: checking for max_fail_percentage 12372 1727204083.06626: done checking for max_fail_percentage 12372 1727204083.06627: checking to see if all hosts have failed and the running result is not ok 12372 1727204083.06628: done checking to see if all hosts have failed 12372 1727204083.06628: getting the remaining hosts for this loop 12372 1727204083.06630: done getting the remaining hosts for this loop 12372 1727204083.06635: getting the next task for host managed-node3 12372 1727204083.06642: done getting next task for host managed-node3 12372 1727204083.06646: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12372 1727204083.06649: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204083.06671: getting variables 12372 1727204083.06673: in VariableManager get_vars() 12372 1727204083.06729: Calling all_inventory to load vars for managed-node3 12372 1727204083.06732: Calling groups_inventory to load vars for managed-node3 12372 1727204083.06735: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204083.06744: Calling all_plugins_play to load vars for managed-node3 12372 1727204083.06748: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204083.06752: Calling groups_plugins_play to load vars for managed-node3 12372 1727204083.07084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204083.07404: done with get_vars() 12372 1727204083.07421: done getting variables 12372 1727204083.07498: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:54:43 -0400 (0:00:00.063) 0:00:10.061 ***** 12372 1727204083.07541: entering _queue_task() for managed-node3/fail 12372 1727204083.07885: worker is 1 (out of 1 available) 12372 1727204083.08013: exiting _queue_task() for managed-node3/fail 12372 1727204083.08033: done queuing things up, now waiting for results queue to drain 12372 1727204083.08035: waiting for pending results... 
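The "Abort applying the network state configuration if using the `network_state` variable with the initscripts provider" task is a fail action, so in a run where it is not skipped it would stop the play with an error message. Its own conditions are not visible here; the skip result only reports the inherited distribution guard as false_condition, which appears to be the conditional that actually evaluated to False. The sketch below therefore shows an assumed shape for the task; the message and both conditions are illustrative guesses, not the role's code.

    # Illustrative: a fail task with assumed role-level conditions. In the run
    # above, the inherited distribution guard is the condition reported as
    # false, so these conditions were never the reason for the skip.
    - name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying network_state is not supported with the initscripts provider   # assumed
      when:
        - network_state is defined                               # assumed
        - network_provider | default('nm') == 'initscripts'      # assumed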
12372 1727204083.08509: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12372 1727204083.08520: in run() - task 12b410aa-8751-244a-02f9-00000000011e 12372 1727204083.08524: variable 'ansible_search_path' from source: unknown 12372 1727204083.08529: variable 'ansible_search_path' from source: unknown 12372 1727204083.08577: calling self._execute() 12372 1727204083.08715: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204083.08726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204083.08751: variable 'omit' from source: magic vars 12372 1727204083.09454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204083.12881: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204083.12971: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204083.13043: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204083.13062: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204083.13094: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204083.13200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204083.13241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204083.13494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204083.13498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204083.13501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204083.13523: variable 'ansible_distribution' from source: facts 12372 1727204083.13532: variable 'ansible_distribution_major_version' from source: facts 12372 1727204083.13546: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204083.13549: when evaluation is False, skipping this task 12372 1727204083.13554: _execute() done 12372 1727204083.13558: dumping result to json 12372 1727204083.13564: done dumping result, returning 12372 1727204083.13575: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-244a-02f9-00000000011e] 12372 1727204083.13586: sending task result for task 12b410aa-8751-244a-02f9-00000000011e skipping: [managed-node3] 
=> { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204083.13754: no more pending results, returning what we have 12372 1727204083.13758: results queue empty 12372 1727204083.13759: checking for any_errors_fatal 12372 1727204083.13766: done checking for any_errors_fatal 12372 1727204083.13767: checking for max_fail_percentage 12372 1727204083.13769: done checking for max_fail_percentage 12372 1727204083.13770: checking to see if all hosts have failed and the running result is not ok 12372 1727204083.13772: done checking to see if all hosts have failed 12372 1727204083.13773: getting the remaining hosts for this loop 12372 1727204083.13775: done getting the remaining hosts for this loop 12372 1727204083.13780: getting the next task for host managed-node3 12372 1727204083.13787: done getting next task for host managed-node3 12372 1727204083.13794: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12372 1727204083.13798: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204083.13827: getting variables 12372 1727204083.13830: in VariableManager get_vars() 12372 1727204083.14141: Calling all_inventory to load vars for managed-node3 12372 1727204083.14146: Calling groups_inventory to load vars for managed-node3 12372 1727204083.14149: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204083.14161: Calling all_plugins_play to load vars for managed-node3 12372 1727204083.14164: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204083.14169: Calling groups_plugins_play to load vars for managed-node3 12372 1727204083.14437: done sending task result for task 12b410aa-8751-244a-02f9-00000000011e 12372 1727204083.14442: WORKER PROCESS EXITING 12372 1727204083.14469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204083.14786: done with get_vars() 12372 1727204083.14801: done getting variables 12372 1727204083.14872: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:54:43 -0400 (0:00:00.073) 0:00:10.134 ***** 12372 1727204083.14912: entering _queue_task() for managed-node3/fail 12372 1727204083.15207: worker is 1 (out of 1 available) 12372 1727204083.15222: exiting _queue_task() for managed-node3/fail 
12372 1727204083.15235: done queuing things up, now waiting for results queue to drain 12372 1727204083.15237: waiting for pending results... 12372 1727204083.15593: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12372 1727204083.15708: in run() - task 12b410aa-8751-244a-02f9-00000000011f 12372 1727204083.15729: variable 'ansible_search_path' from source: unknown 12372 1727204083.15738: variable 'ansible_search_path' from source: unknown 12372 1727204083.15781: calling self._execute() 12372 1727204083.15909: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204083.15914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204083.15923: variable 'omit' from source: magic vars 12372 1727204083.16474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204083.19397: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204083.19782: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204083.19827: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204083.19868: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204083.19906: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204083.20008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204083.20053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204083.20084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204083.20144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204083.20162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204083.20338: variable 'ansible_distribution' from source: facts 12372 1727204083.20345: variable 'ansible_distribution_major_version' from source: facts 12372 1727204083.20359: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204083.20362: when evaluation is False, skipping this task 12372 1727204083.20365: _execute() done 12372 1727204083.20389: dumping result to json 12372 1727204083.20392: done dumping result, returning 12372 1727204083.20397: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 
[12b410aa-8751-244a-02f9-00000000011f] 12372 1727204083.20399: sending task result for task 12b410aa-8751-244a-02f9-00000000011f 12372 1727204083.20669: done sending task result for task 12b410aa-8751-244a-02f9-00000000011f 12372 1727204083.20672: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204083.20719: no more pending results, returning what we have 12372 1727204083.20723: results queue empty 12372 1727204083.20724: checking for any_errors_fatal 12372 1727204083.20731: done checking for any_errors_fatal 12372 1727204083.20732: checking for max_fail_percentage 12372 1727204083.20734: done checking for max_fail_percentage 12372 1727204083.20735: checking to see if all hosts have failed and the running result is not ok 12372 1727204083.20736: done checking to see if all hosts have failed 12372 1727204083.20737: getting the remaining hosts for this loop 12372 1727204083.20738: done getting the remaining hosts for this loop 12372 1727204083.20742: getting the next task for host managed-node3 12372 1727204083.20748: done getting next task for host managed-node3 12372 1727204083.20752: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12372 1727204083.20755: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204083.20771: getting variables 12372 1727204083.20773: in VariableManager get_vars() 12372 1727204083.20831: Calling all_inventory to load vars for managed-node3 12372 1727204083.20835: Calling groups_inventory to load vars for managed-node3 12372 1727204083.20838: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204083.20848: Calling all_plugins_play to load vars for managed-node3 12372 1727204083.20852: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204083.20856: Calling groups_plugins_play to load vars for managed-node3 12372 1727204083.21099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204083.21410: done with get_vars() 12372 1727204083.21425: done getting variables 12372 1727204083.21506: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:54:43 -0400 (0:00:00.066) 0:00:10.201 ***** 12372 1727204083.21543: entering _queue_task() for managed-node3/dnf 12372 1727204083.21920: worker is 1 (out of 1 available) 12372 1727204083.21934: exiting _queue_task() for managed-node3/dnf 12372 1727204083.21946: done queuing things up, now waiting for results queue to drain 12372 1727204083.21948: waiting for pending results... 
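The two abort tasks at main.yml:18 and main.yml:25 gate on the managed host's major version: one rejects applying network state on systems below 8, the other rejects teaming configuration on EL10 or later. A sketch of how such version gates are commonly written follows; the messages and the exact conditions are assumptions based only on the task names, and the extra interface checks the real role would need are omitted.

    # Illustrative version gates matching the task names in the log.
    - name: Abort applying the network state configuration if the system version of the managed host is below 8
      ansible.builtin.fail:
        msg: Applying the network state configuration requires EL8 or later    # assumed
      when:
        - network_state is defined                                # assumed
        - ansible_distribution_major_version | int < 8

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on EL10 and later               # assumed
      when:
        - ansible_distribution_major_version | int >= 10
        # a check that the requested connections actually use team interfaces is omitted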
12372 1727204083.22313: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12372 1727204083.22396: in run() - task 12b410aa-8751-244a-02f9-000000000120 12372 1727204083.22431: variable 'ansible_search_path' from source: unknown 12372 1727204083.22435: variable 'ansible_search_path' from source: unknown 12372 1727204083.22516: calling self._execute() 12372 1727204083.22585: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204083.22605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204083.22628: variable 'omit' from source: magic vars 12372 1727204083.23218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204083.28115: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204083.28245: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204083.28274: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204083.28327: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204083.28398: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204083.28484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204083.28534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204083.28578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204083.28681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204083.28685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204083.28854: variable 'ansible_distribution' from source: facts 12372 1727204083.28869: variable 'ansible_distribution_major_version' from source: facts 12372 1727204083.28894: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204083.28905: when evaluation is False, skipping this task 12372 1727204083.28942: _execute() done 12372 1727204083.28945: dumping result to json 12372 1727204083.28948: done dumping result, returning 12372 1727204083.28951: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-244a-02f9-000000000120] 12372 1727204083.28955: sending task result for task 
12b410aa-8751-244a-02f9-000000000120 12372 1727204083.29232: done sending task result for task 12b410aa-8751-244a-02f9-000000000120 12372 1727204083.29235: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204083.29298: no more pending results, returning what we have 12372 1727204083.29303: results queue empty 12372 1727204083.29304: checking for any_errors_fatal 12372 1727204083.29312: done checking for any_errors_fatal 12372 1727204083.29313: checking for max_fail_percentage 12372 1727204083.29315: done checking for max_fail_percentage 12372 1727204083.29316: checking to see if all hosts have failed and the running result is not ok 12372 1727204083.29317: done checking to see if all hosts have failed 12372 1727204083.29317: getting the remaining hosts for this loop 12372 1727204083.29319: done getting the remaining hosts for this loop 12372 1727204083.29325: getting the next task for host managed-node3 12372 1727204083.29333: done getting next task for host managed-node3 12372 1727204083.29337: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12372 1727204083.29341: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204083.29366: getting variables 12372 1727204083.29368: in VariableManager get_vars() 12372 1727204083.29565: Calling all_inventory to load vars for managed-node3 12372 1727204083.29570: Calling groups_inventory to load vars for managed-node3 12372 1727204083.29573: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204083.29583: Calling all_plugins_play to load vars for managed-node3 12372 1727204083.29587: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204083.29700: Calling groups_plugins_play to load vars for managed-node3 12372 1727204083.30155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204083.30465: done with get_vars() 12372 1727204083.30485: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12372 1727204083.30577: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:54:43 -0400 (0:00:00.090) 0:00:10.291 ***** 12372 1727204083.30622: entering _queue_task() for managed-node3/yum 12372 1727204083.31007: worker is 1 (out of 1 available) 12372 1727204083.31024: exiting _queue_task() for managed-node3/yum 12372 1727204083.31036: done queuing things up, now waiting for results queue to drain 12372 1727204083.31038: waiting for pending results... 
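Note the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line above: on this controller the yum action is an alias that resolves to the dnf action plugin, so the YUM check task at main.yml:48 runs through the same code path as the DNF check task at main.yml:36. The pair below is only a stand-in for those two tasks, assuming a check-mode "are updates available" pattern; the package name and check_mode usage are illustrative, not read from the role.

    # Both of these resolve to the dnf action plugin on dnf-based targets.
    - name: Check if updates for network packages are available through the DNF package manager (stand-in for main.yml:36)
      ansible.builtin.dnf:
        name: NetworkManager          # package name assumed for illustration
        state: latest
      check_mode: true

    - name: Check if updates for network packages are available through the YUM package manager (stand-in for main.yml:48)
      ansible.builtin.yum:            # redirected to the dnf action plugin, as logged above
        name: NetworkManager          # package name assumed for illustration
        state: latest
      check_mode: true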
12372 1727204083.31308: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12372 1727204083.31466: in run() - task 12b410aa-8751-244a-02f9-000000000121 12372 1727204083.31494: variable 'ansible_search_path' from source: unknown 12372 1727204083.31500: variable 'ansible_search_path' from source: unknown 12372 1727204083.31575: calling self._execute() 12372 1727204083.31643: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204083.31659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204083.31686: variable 'omit' from source: magic vars 12372 1727204083.32244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204083.35595: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204083.35600: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204083.35632: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204083.35679: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204083.35724: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204083.35830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204083.35873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204083.35914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204083.35981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204083.36006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204083.36263: variable 'ansible_distribution' from source: facts 12372 1727204083.36267: variable 'ansible_distribution_major_version' from source: facts 12372 1727204083.36269: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204083.36271: when evaluation is False, skipping this task 12372 1727204083.36274: _execute() done 12372 1727204083.36276: dumping result to json 12372 1727204083.36278: done dumping result, returning 12372 1727204083.36280: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-244a-02f9-000000000121] 12372 1727204083.36282: sending task result for task 
12b410aa-8751-244a-02f9-000000000121 12372 1727204083.36361: done sending task result for task 12b410aa-8751-244a-02f9-000000000121 12372 1727204083.36476: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204083.36537: no more pending results, returning what we have 12372 1727204083.36541: results queue empty 12372 1727204083.36542: checking for any_errors_fatal 12372 1727204083.36548: done checking for any_errors_fatal 12372 1727204083.36549: checking for max_fail_percentage 12372 1727204083.36551: done checking for max_fail_percentage 12372 1727204083.36552: checking to see if all hosts have failed and the running result is not ok 12372 1727204083.36553: done checking to see if all hosts have failed 12372 1727204083.36554: getting the remaining hosts for this loop 12372 1727204083.36555: done getting the remaining hosts for this loop 12372 1727204083.36560: getting the next task for host managed-node3 12372 1727204083.36568: done getting next task for host managed-node3 12372 1727204083.36573: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12372 1727204083.36576: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204083.36599: getting variables 12372 1727204083.36601: in VariableManager get_vars() 12372 1727204083.36661: Calling all_inventory to load vars for managed-node3 12372 1727204083.36665: Calling groups_inventory to load vars for managed-node3 12372 1727204083.36668: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204083.36679: Calling all_plugins_play to load vars for managed-node3 12372 1727204083.36682: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204083.36686: Calling groups_plugins_play to load vars for managed-node3 12372 1727204083.37100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204083.37415: done with get_vars() 12372 1727204083.37428: done getting variables 12372 1727204083.37504: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:54:43 -0400 (0:00:00.069) 0:00:10.361 ***** 12372 1727204083.37543: entering _queue_task() for managed-node3/fail 12372 1727204083.37855: worker is 1 (out of 1 available) 12372 1727204083.37868: exiting _queue_task() for managed-node3/fail 12372 1727204083.37881: done queuing things up, now waiting for results queue to drain 12372 1727204083.37883: waiting for pending results... 
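The "Ask user's consent to restart NetworkManager" task at main.yml:60 is also a fail action, which suggests the role deliberately aborts unless the caller has opted in to a NetworkManager restart when wireless or team interfaces are involved. The opt-in variable name is not visible in this log, so the one used below is hypothetical.

    # Hypothetical opt-in gate: abort unless the caller explicitly allowed a restart.
    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: >-
          Wireless or team interfaces require restarting NetworkManager.
          Set allow_nm_restart=true (hypothetical variable) to confirm.
      when:
        - not (allow_nm_restart | default(false) | bool)    # variable name assumed
        # a check that wireless/team interfaces are actually requested is omitted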
12372 1727204083.38187: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12372 1727204083.38349: in run() - task 12b410aa-8751-244a-02f9-000000000122 12372 1727204083.38445: variable 'ansible_search_path' from source: unknown 12372 1727204083.38449: variable 'ansible_search_path' from source: unknown 12372 1727204083.38452: calling self._execute() 12372 1727204083.38523: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204083.38538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204083.38561: variable 'omit' from source: magic vars 12372 1727204083.39176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204083.42085: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204083.42233: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204083.42518: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204083.42522: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204083.42796: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204083.42924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204083.42995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204083.43034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204083.43103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204083.43128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204083.43318: variable 'ansible_distribution' from source: facts 12372 1727204083.43332: variable 'ansible_distribution_major_version' from source: facts 12372 1727204083.43349: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204083.43357: when evaluation is False, skipping this task 12372 1727204083.43364: _execute() done 12372 1727204083.43372: dumping result to json 12372 1727204083.43388: done dumping result, returning 12372 1727204083.43408: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-244a-02f9-000000000122] 12372 1727204083.43420: sending task result for task 12b410aa-8751-244a-02f9-000000000122 skipping: [managed-node3] => { "changed": false, 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204083.43706: no more pending results, returning what we have 12372 1727204083.43710: results queue empty 12372 1727204083.43711: checking for any_errors_fatal 12372 1727204083.43721: done checking for any_errors_fatal 12372 1727204083.43722: checking for max_fail_percentage 12372 1727204083.43724: done checking for max_fail_percentage 12372 1727204083.43725: checking to see if all hosts have failed and the running result is not ok 12372 1727204083.43726: done checking to see if all hosts have failed 12372 1727204083.43727: getting the remaining hosts for this loop 12372 1727204083.43729: done getting the remaining hosts for this loop 12372 1727204083.43734: getting the next task for host managed-node3 12372 1727204083.43742: done getting next task for host managed-node3 12372 1727204083.43747: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 12372 1727204083.43750: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204083.43996: done sending task result for task 12b410aa-8751-244a-02f9-000000000122 12372 1727204083.43999: WORKER PROCESS EXITING 12372 1727204083.44014: getting variables 12372 1727204083.44016: in VariableManager get_vars() 12372 1727204083.44073: Calling all_inventory to load vars for managed-node3 12372 1727204083.44077: Calling groups_inventory to load vars for managed-node3 12372 1727204083.44080: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204083.44115: Calling all_plugins_play to load vars for managed-node3 12372 1727204083.44120: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204083.44125: Calling groups_plugins_play to load vars for managed-node3 12372 1727204083.44487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204083.44772: done with get_vars() 12372 1727204083.44785: done getting variables 12372 1727204083.44856: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:54:43 -0400 (0:00:00.073) 0:00:10.434 ***** 12372 1727204083.44904: entering _queue_task() for managed-node3/package 12372 1727204083.45200: worker is 1 (out of 1 available) 12372 1727204083.45214: exiting _queue_task() for managed-node3/package 12372 1727204083.45227: done queuing things up, now waiting for results queue to drain 12372 1727204083.45229: waiting for pending results... 
12372 1727204083.45610: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 12372 1727204083.45674: in run() - task 12b410aa-8751-244a-02f9-000000000123 12372 1727204083.45698: variable 'ansible_search_path' from source: unknown 12372 1727204083.45712: variable 'ansible_search_path' from source: unknown 12372 1727204083.45760: calling self._execute() 12372 1727204083.45868: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204083.45882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204083.45900: variable 'omit' from source: magic vars 12372 1727204083.46456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204083.49446: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204083.49539: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204083.49588: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204083.49791: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204083.49796: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204083.49800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204083.49803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204083.49839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204083.49897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204083.49921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204083.50233: variable 'ansible_distribution' from source: facts 12372 1727204083.50305: variable 'ansible_distribution_major_version' from source: facts 12372 1727204083.50323: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204083.50331: when evaluation is False, skipping this task 12372 1727204083.50338: _execute() done 12372 1727204083.50347: dumping result to json 12372 1727204083.50696: done dumping result, returning 12372 1727204083.50699: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-244a-02f9-000000000123] 12372 1727204083.50702: sending task result for task 12b410aa-8751-244a-02f9-000000000123 12372 1727204083.50782: done sending task result for task 12b410aa-8751-244a-02f9-000000000123 12372 1727204083.50785: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204083.51038: no more pending results, returning what we have 12372 1727204083.51042: results queue empty 12372 1727204083.51043: checking for any_errors_fatal 12372 1727204083.51050: done checking for any_errors_fatal 12372 1727204083.51051: checking for max_fail_percentage 12372 1727204083.51052: done checking for max_fail_percentage 12372 1727204083.51054: checking to see if all hosts have failed and the running result is not ok 12372 1727204083.51055: done checking to see if all hosts have failed 12372 1727204083.51056: getting the remaining hosts for this loop 12372 1727204083.51057: done getting the remaining hosts for this loop 12372 1727204083.51061: getting the next task for host managed-node3 12372 1727204083.51067: done getting next task for host managed-node3 12372 1727204083.51071: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12372 1727204083.51074: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204083.51096: getting variables 12372 1727204083.51098: in VariableManager get_vars() 12372 1727204083.51152: Calling all_inventory to load vars for managed-node3 12372 1727204083.51155: Calling groups_inventory to load vars for managed-node3 12372 1727204083.51158: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204083.51168: Calling all_plugins_play to load vars for managed-node3 12372 1727204083.51172: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204083.51175: Calling groups_plugins_play to load vars for managed-node3 12372 1727204083.51664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204083.52373: done with get_vars() 12372 1727204083.52386: done getting variables 12372 1727204083.52455: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:54:43 -0400 (0:00:00.075) 0:00:10.510 ***** 12372 1727204083.52539: entering _queue_task() for managed-node3/package 12372 1727204083.53018: worker is 1 (out of 1 available) 12372 1727204083.53031: exiting _queue_task() for managed-node3/package 12372 1727204083.53043: done queuing things up, now waiting for results queue to drain 12372 1727204083.53045: waiting for pending results... 
12372 1727204083.53757: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12372 1727204083.53854: in run() - task 12b410aa-8751-244a-02f9-000000000124 12372 1727204083.53859: variable 'ansible_search_path' from source: unknown 12372 1727204083.53862: variable 'ansible_search_path' from source: unknown 12372 1727204083.53865: calling self._execute() 12372 1727204083.53976: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204083.53980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204083.53983: variable 'omit' from source: magic vars 12372 1727204083.55029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204083.58638: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204083.58730: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204083.58777: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204083.58822: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204083.58851: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204083.58961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204083.59004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204083.59172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204083.59224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204083.59242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204083.59497: variable 'ansible_distribution' from source: facts 12372 1727204083.59501: variable 'ansible_distribution_major_version' from source: facts 12372 1727204083.59504: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204083.59506: when evaluation is False, skipping this task 12372 1727204083.59509: _execute() done 12372 1727204083.59511: dumping result to json 12372 1727204083.59513: done dumping result, returning 12372 1727204083.59518: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-244a-02f9-000000000124] 12372 1727204083.59521: sending task result for task 12b410aa-8751-244a-02f9-000000000124 skipping: [managed-node3] => { "changed": false, "false_condition": 
"(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204083.59657: no more pending results, returning what we have 12372 1727204083.59661: results queue empty 12372 1727204083.59663: checking for any_errors_fatal 12372 1727204083.59671: done checking for any_errors_fatal 12372 1727204083.59672: checking for max_fail_percentage 12372 1727204083.59674: done checking for max_fail_percentage 12372 1727204083.59675: checking to see if all hosts have failed and the running result is not ok 12372 1727204083.59677: done checking to see if all hosts have failed 12372 1727204083.59678: getting the remaining hosts for this loop 12372 1727204083.59679: done getting the remaining hosts for this loop 12372 1727204083.59684: getting the next task for host managed-node3 12372 1727204083.59695: done getting next task for host managed-node3 12372 1727204083.59699: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12372 1727204083.59702: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204083.59725: getting variables 12372 1727204083.59727: in VariableManager get_vars() 12372 1727204083.60508: Calling all_inventory to load vars for managed-node3 12372 1727204083.60512: Calling groups_inventory to load vars for managed-node3 12372 1727204083.60518: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204083.60529: Calling all_plugins_play to load vars for managed-node3 12372 1727204083.60532: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204083.60537: Calling groups_plugins_play to load vars for managed-node3 12372 1727204083.60819: done sending task result for task 12b410aa-8751-244a-02f9-000000000124 12372 1727204083.60823: WORKER PROCESS EXITING 12372 1727204083.60855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204083.61146: done with get_vars() 12372 1727204083.61159: done getting variables 12372 1727204083.61231: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:54:43 -0400 (0:00:00.087) 0:00:10.598 ***** 12372 1727204083.61266: entering _queue_task() for managed-node3/package 12372 1727204083.61570: worker is 1 (out of 1 available) 12372 1727204083.61582: exiting _queue_task() for managed-node3/package 12372 1727204083.61798: done queuing things up, now waiting for results queue to drain 12372 
1727204083.61800: waiting for pending results... 12372 1727204083.61913: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12372 1727204083.62094: in run() - task 12b410aa-8751-244a-02f9-000000000125 12372 1727204083.62122: variable 'ansible_search_path' from source: unknown 12372 1727204083.62191: variable 'ansible_search_path' from source: unknown 12372 1727204083.62199: calling self._execute() 12372 1727204083.62283: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204083.62303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204083.62326: variable 'omit' from source: magic vars 12372 1727204083.62902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204083.68846: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204083.68851: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204083.68962: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204083.69041: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204083.69129: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204083.69386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204083.69437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204083.69477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204083.69797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204083.69801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204083.69992: variable 'ansible_distribution' from source: facts 12372 1727204083.70155: variable 'ansible_distribution_major_version' from source: facts 12372 1727204083.70174: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204083.70183: when evaluation is False, skipping this task 12372 1727204083.70208: _execute() done 12372 1727204083.70228: dumping result to json 12372 1727204083.70242: done dumping result, returning 12372 1727204083.70345: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-244a-02f9-000000000125] 12372 1727204083.70350: sending task result for task 12b410aa-8751-244a-02f9-000000000125 12372 1727204083.70809: done sending task result for 
task 12b410aa-8751-244a-02f9-000000000125 12372 1727204083.70814: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204083.70876: no more pending results, returning what we have 12372 1727204083.70880: results queue empty 12372 1727204083.70881: checking for any_errors_fatal 12372 1727204083.70893: done checking for any_errors_fatal 12372 1727204083.70894: checking for max_fail_percentage 12372 1727204083.70896: done checking for max_fail_percentage 12372 1727204083.70898: checking to see if all hosts have failed and the running result is not ok 12372 1727204083.70899: done checking to see if all hosts have failed 12372 1727204083.70900: getting the remaining hosts for this loop 12372 1727204083.70901: done getting the remaining hosts for this loop 12372 1727204083.70906: getting the next task for host managed-node3 12372 1727204083.70913: done getting next task for host managed-node3 12372 1727204083.70920: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12372 1727204083.70924: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12372 1727204083.70944: getting variables 12372 1727204083.70946: in VariableManager get_vars() 12372 1727204083.71021: Calling all_inventory to load vars for managed-node3 12372 1727204083.71025: Calling groups_inventory to load vars for managed-node3 12372 1727204083.71028: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204083.71039: Calling all_plugins_play to load vars for managed-node3 12372 1727204083.71043: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204083.71047: Calling groups_plugins_play to load vars for managed-node3 12372 1727204083.71620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204083.72364: done with get_vars() 12372 1727204083.72378: done getting variables 12372 1727204083.72453: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:54:43 -0400 (0:00:00.112) 0:00:10.710 ***** 12372 1727204083.72696: entering _queue_task() for managed-node3/service 12372 1727204083.73191: worker is 1 (out of 1 available) 12372 1727204083.73206: exiting _queue_task() for managed-node3/service 12372 1727204083.73224: done queuing things up, now waiting for results queue to drain 12372 1727204083.73226: waiting for pending results... 
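The block of entries above queues and skips the role's three package tasks in turn (Install packages at main.yml:73, Install NetworkManager and nmstate when using network_state variable at main.yml:85, Install python3-libnmstate when using network_state variable at main.yml:96), each through the package action plugin and each with the same false distribution guard. A minimal sketch of a task of that shape, assuming the network_state guard implied by the task name (the exact condition is not visible in this log):

# Sketch only; the package list and guard are assumptions drawn from the task names.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when:
    - network_state | default({}) != {}   # only when the nmstate-style variable is in use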
12372 1727204083.73806: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12372 1727204083.74098: in run() - task 12b410aa-8751-244a-02f9-000000000126 12372 1727204083.74297: variable 'ansible_search_path' from source: unknown 12372 1727204083.74301: variable 'ansible_search_path' from source: unknown 12372 1727204083.74305: calling self._execute() 12372 1727204083.74494: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204083.74498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204083.74501: variable 'omit' from source: magic vars 12372 1727204083.75634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204083.81606: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204083.81818: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204083.81863: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204083.82032: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204083.82063: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204083.82274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204083.82414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204083.82453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204083.82506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204083.82526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204083.82864: variable 'ansible_distribution' from source: facts 12372 1727204083.82868: variable 'ansible_distribution_major_version' from source: facts 12372 1727204083.82881: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204083.82884: when evaluation is False, skipping this task 12372 1727204083.82887: _execute() done 12372 1727204083.82893: dumping result to json 12372 1727204083.83115: done dumping result, returning 12372 1727204083.83128: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-244a-02f9-000000000126] 12372 1727204083.83135: sending task result for task 12b410aa-8751-244a-02f9-000000000126 skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204083.83354: no more pending results, returning what we have 12372 1727204083.83358: results queue empty 12372 1727204083.83359: checking for any_errors_fatal 12372 1727204083.83365: done checking for any_errors_fatal 12372 1727204083.83366: checking for max_fail_percentage 12372 1727204083.83368: done checking for max_fail_percentage 12372 1727204083.83368: checking to see if all hosts have failed and the running result is not ok 12372 1727204083.83369: done checking to see if all hosts have failed 12372 1727204083.83370: getting the remaining hosts for this loop 12372 1727204083.83372: done getting the remaining hosts for this loop 12372 1727204083.83377: getting the next task for host managed-node3 12372 1727204083.83384: done getting next task for host managed-node3 12372 1727204083.83388: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12372 1727204083.83393: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204083.83418: getting variables 12372 1727204083.83421: in VariableManager get_vars() 12372 1727204083.83488: Calling all_inventory to load vars for managed-node3 12372 1727204083.83497: Calling groups_inventory to load vars for managed-node3 12372 1727204083.83501: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204083.83613: Calling all_plugins_play to load vars for managed-node3 12372 1727204083.83618: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204083.83621: Calling groups_plugins_play to load vars for managed-node3 12372 1727204083.84091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204083.84659: done sending task result for task 12b410aa-8751-244a-02f9-000000000126 12372 1727204083.84662: WORKER PROCESS EXITING 12372 1727204083.84803: done with get_vars() 12372 1727204083.84817: done getting variables 12372 1727204083.84887: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:54:43 -0400 (0:00:00.125) 0:00:10.835 ***** 12372 1727204083.85025: entering _queue_task() for managed-node3/service 12372 1727204083.85512: worker is 1 (out of 1 available) 12372 1727204083.85528: exiting _queue_task() for managed-node3/service 12372 1727204083.85542: done queuing things up, now waiting for results queue to drain 12372 1727204083.85544: waiting for pending results... 
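By this point the same false_condition has been reported for every task in the block, so the two facts it depends on are worth checking directly when debugging a run like this. A standalone sketch (not part of the role) that prints them and re-evaluates the guard:

# Minimal playbook sketch for inspecting the facts behind the skips; assumes fact gathering is enabled.
- hosts: managed-node3
  gather_facts: true
  tasks:
    - name: Show the facts used by the role's distribution guard
      ansible.builtin.debug:
        msg: >-
          distribution={{ ansible_distribution }}
          major={{ ansible_distribution_major_version }}
          guard={{ ansible_distribution in ['CentOS', 'RedHat']
                   and ansible_distribution_major_version | int < 9 }}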
12372 1727204083.86085: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12372 1727204083.86429: in run() - task 12b410aa-8751-244a-02f9-000000000127 12372 1727204083.86446: variable 'ansible_search_path' from source: unknown 12372 1727204083.86450: variable 'ansible_search_path' from source: unknown 12372 1727204083.86490: calling self._execute() 12372 1727204083.86577: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204083.86586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204083.86802: variable 'omit' from source: magic vars 12372 1727204083.87702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204083.92931: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204083.92988: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204083.93048: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204083.93101: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204083.93141: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204083.93242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204083.93281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204083.93324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204083.93379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204083.93402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204083.93575: variable 'ansible_distribution' from source: facts 12372 1727204083.93587: variable 'ansible_distribution_major_version' from source: facts 12372 1727204083.93608: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204083.93619: when evaluation is False, skipping this task 12372 1727204083.93628: _execute() done 12372 1727204083.93635: dumping result to json 12372 1727204083.93643: done dumping result, returning 12372 1727204083.93656: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-244a-02f9-000000000127] 12372 1727204083.93665: sending task result for task 12b410aa-8751-244a-02f9-000000000127 skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 
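The skipped result just above for "Enable and start NetworkManager" is reported as censored rather than with a false_condition because the task sets no_log: true, and Ansible replaces the result body with that message even for skips. A generic sketch of a task that produces output of this shape (the service parameters are taken from the task name, not from the role source):

# Sketch of a service task whose results are censored by no_log.
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true   # results, including skips, are replaced with the "censored" message seen above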
12372 1727204083.93827: no more pending results, returning what we have 12372 1727204083.93831: results queue empty 12372 1727204083.93832: checking for any_errors_fatal 12372 1727204083.93840: done checking for any_errors_fatal 12372 1727204083.93841: checking for max_fail_percentage 12372 1727204083.93843: done checking for max_fail_percentage 12372 1727204083.93843: checking to see if all hosts have failed and the running result is not ok 12372 1727204083.93844: done checking to see if all hosts have failed 12372 1727204083.93845: getting the remaining hosts for this loop 12372 1727204083.93847: done getting the remaining hosts for this loop 12372 1727204083.93852: getting the next task for host managed-node3 12372 1727204083.93858: done getting next task for host managed-node3 12372 1727204083.93863: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12372 1727204083.93866: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204083.93888: getting variables 12372 1727204083.93892: in VariableManager get_vars() 12372 1727204083.93955: Calling all_inventory to load vars for managed-node3 12372 1727204083.93958: Calling groups_inventory to load vars for managed-node3 12372 1727204083.93961: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204083.93972: Calling all_plugins_play to load vars for managed-node3 12372 1727204083.93975: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204083.93978: Calling groups_plugins_play to load vars for managed-node3 12372 1727204083.94407: done sending task result for task 12b410aa-8751-244a-02f9-000000000127 12372 1727204083.94410: WORKER PROCESS EXITING 12372 1727204083.94442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204083.94811: done with get_vars() 12372 1727204083.94826: done getting variables 12372 1727204083.94897: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:54:43 -0400 (0:00:00.099) 0:00:10.935 ***** 12372 1727204083.94940: entering _queue_task() for managed-node3/service 12372 1727204083.95255: worker is 1 (out of 1 available) 12372 1727204083.95269: exiting _queue_task() for managed-node3/service 12372 1727204083.95295: done queuing things up, now waiting for results queue to drain 12372 1727204083.95297: waiting for pending results... 
12372 1727204083.95609: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12372 1727204083.95758: in run() - task 12b410aa-8751-244a-02f9-000000000128 12372 1727204083.95773: variable 'ansible_search_path' from source: unknown 12372 1727204083.95777: variable 'ansible_search_path' from source: unknown 12372 1727204083.95820: calling self._execute() 12372 1727204083.95925: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204083.95933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204083.95948: variable 'omit' from source: magic vars 12372 1727204083.96783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204084.00871: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204084.00975: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204084.01026: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204084.01125: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204084.01129: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204084.01205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204084.01250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204084.01281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204084.01344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204084.01384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204084.01538: variable 'ansible_distribution' from source: facts 12372 1727204084.01563: variable 'ansible_distribution_major_version' from source: facts 12372 1727204084.01566: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204084.01569: when evaluation is False, skipping this task 12372 1727204084.01668: _execute() done 12372 1727204084.01671: dumping result to json 12372 1727204084.01673: done dumping result, returning 12372 1727204084.01676: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-244a-02f9-000000000128] 12372 1727204084.01678: sending task result for task 12b410aa-8751-244a-02f9-000000000128 12372 1727204084.01751: done sending task result for task 12b410aa-8751-244a-02f9-000000000128 12372 1727204084.01754: WORKER PROCESS EXITING skipping: 
[managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204084.01825: no more pending results, returning what we have 12372 1727204084.01829: results queue empty 12372 1727204084.01830: checking for any_errors_fatal 12372 1727204084.01838: done checking for any_errors_fatal 12372 1727204084.01839: checking for max_fail_percentage 12372 1727204084.01840: done checking for max_fail_percentage 12372 1727204084.01841: checking to see if all hosts have failed and the running result is not ok 12372 1727204084.01843: done checking to see if all hosts have failed 12372 1727204084.01844: getting the remaining hosts for this loop 12372 1727204084.01845: done getting the remaining hosts for this loop 12372 1727204084.01850: getting the next task for host managed-node3 12372 1727204084.01856: done getting next task for host managed-node3 12372 1727204084.01861: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12372 1727204084.01864: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204084.01882: getting variables 12372 1727204084.01884: in VariableManager get_vars() 12372 1727204084.01939: Calling all_inventory to load vars for managed-node3 12372 1727204084.01943: Calling groups_inventory to load vars for managed-node3 12372 1727204084.01946: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204084.01957: Calling all_plugins_play to load vars for managed-node3 12372 1727204084.01960: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204084.01963: Calling groups_plugins_play to load vars for managed-node3 12372 1727204084.02207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204084.02548: done with get_vars() 12372 1727204084.02563: done getting variables 12372 1727204084.02653: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.077) 0:00:11.012 ***** 12372 1727204084.02936: entering _queue_task() for managed-node3/service 12372 1727204084.03509: worker is 1 (out of 1 available) 12372 1727204084.03526: exiting _queue_task() for managed-node3/service 12372 1727204084.03541: done queuing things up, now waiting for results queue to drain 12372 1727204084.03543: waiting for pending results... 
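The entries above queue "Enable network service" (main.yml:142) through the same service action plugin, and the result that follows is censored in the same way. As an illustration only, an initscripts-provider counterpart to the NetworkManager task might look like the sketch below; the provider guard is an assumption, not something shown in this log:

# Sketch; the network_provider guard is hypothetical.
- name: Enable network service
  ansible.builtin.service:
    name: network
    enabled: true
  no_log: true
  when:
    - network_provider | default('nm') == 'initscripts'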
12372 1727204084.04130: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 12372 1727204084.04440: in run() - task 12b410aa-8751-244a-02f9-000000000129 12372 1727204084.04463: variable 'ansible_search_path' from source: unknown 12372 1727204084.04477: variable 'ansible_search_path' from source: unknown 12372 1727204084.04709: calling self._execute() 12372 1727204084.04713: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204084.04718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204084.04721: variable 'omit' from source: magic vars 12372 1727204084.05341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204084.08199: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204084.08259: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204084.08297: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204084.08329: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204084.08354: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204084.08428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204084.08453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204084.08476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204084.08513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204084.08529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204084.08647: variable 'ansible_distribution' from source: facts 12372 1727204084.08653: variable 'ansible_distribution_major_version' from source: facts 12372 1727204084.08665: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204084.08669: when evaluation is False, skipping this task 12372 1727204084.08672: _execute() done 12372 1727204084.08674: dumping result to json 12372 1727204084.08679: done dumping result, returning 12372 1727204084.08687: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-244a-02f9-000000000129] 12372 1727204084.08695: sending task result for task 12b410aa-8751-244a-02f9-000000000129 12372 1727204084.08790: done sending task result for task 12b410aa-8751-244a-02f9-000000000129 12372 1727204084.08794: WORKER PROCESS EXITING skipping: [managed-node3] => { 
"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12372 1727204084.08843: no more pending results, returning what we have 12372 1727204084.08847: results queue empty 12372 1727204084.08848: checking for any_errors_fatal 12372 1727204084.08855: done checking for any_errors_fatal 12372 1727204084.08856: checking for max_fail_percentage 12372 1727204084.08857: done checking for max_fail_percentage 12372 1727204084.08859: checking to see if all hosts have failed and the running result is not ok 12372 1727204084.08860: done checking to see if all hosts have failed 12372 1727204084.08861: getting the remaining hosts for this loop 12372 1727204084.08862: done getting the remaining hosts for this loop 12372 1727204084.08866: getting the next task for host managed-node3 12372 1727204084.08873: done getting next task for host managed-node3 12372 1727204084.08879: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12372 1727204084.08882: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204084.08905: getting variables 12372 1727204084.08907: in VariableManager get_vars() 12372 1727204084.08958: Calling all_inventory to load vars for managed-node3 12372 1727204084.08961: Calling groups_inventory to load vars for managed-node3 12372 1727204084.08964: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204084.08973: Calling all_plugins_play to load vars for managed-node3 12372 1727204084.08976: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204084.08979: Calling groups_plugins_play to load vars for managed-node3 12372 1727204084.09180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204084.09352: done with get_vars() 12372 1727204084.09361: done getting variables 12372 1727204084.09411: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.067) 0:00:11.080 ***** 12372 1727204084.09439: entering _queue_task() for managed-node3/copy 12372 1727204084.09644: worker is 1 (out of 1 available) 12372 1727204084.09659: exiting _queue_task() for managed-node3/copy 12372 1727204084.09673: done queuing things up, now waiting for results queue to drain 12372 1727204084.09675: waiting for pending results... 
12372 1727204084.09866: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12372 1727204084.10099: in run() - task 12b410aa-8751-244a-02f9-00000000012a 12372 1727204084.10103: variable 'ansible_search_path' from source: unknown 12372 1727204084.10106: variable 'ansible_search_path' from source: unknown 12372 1727204084.10108: calling self._execute() 12372 1727204084.10242: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204084.10246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204084.10249: variable 'omit' from source: magic vars 12372 1727204084.10695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204084.12541: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204084.12605: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204084.12638: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204084.12668: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204084.12696: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204084.12763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204084.12789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204084.12815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204084.12850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204084.12862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204084.12975: variable 'ansible_distribution' from source: facts 12372 1727204084.12980: variable 'ansible_distribution_major_version' from source: facts 12372 1727204084.12992: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204084.12995: when evaluation is False, skipping this task 12372 1727204084.13000: _execute() done 12372 1727204084.13003: dumping result to json 12372 1727204084.13010: done dumping result, returning 12372 1727204084.13018: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-244a-02f9-00000000012a] 12372 1727204084.13027: sending task result for task 12b410aa-8751-244a-02f9-00000000012a 12372 1727204084.13122: done sending task result for task 12b410aa-8751-244a-02f9-00000000012a 12372 1727204084.13125: 
WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204084.13176: no more pending results, returning what we have 12372 1727204084.13179: results queue empty 12372 1727204084.13180: checking for any_errors_fatal 12372 1727204084.13188: done checking for any_errors_fatal 12372 1727204084.13188: checking for max_fail_percentage 12372 1727204084.13192: done checking for max_fail_percentage 12372 1727204084.13193: checking to see if all hosts have failed and the running result is not ok 12372 1727204084.13194: done checking to see if all hosts have failed 12372 1727204084.13195: getting the remaining hosts for this loop 12372 1727204084.13196: done getting the remaining hosts for this loop 12372 1727204084.13201: getting the next task for host managed-node3 12372 1727204084.13207: done getting next task for host managed-node3 12372 1727204084.13211: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12372 1727204084.13214: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204084.13236: getting variables 12372 1727204084.13238: in VariableManager get_vars() 12372 1727204084.13288: Calling all_inventory to load vars for managed-node3 12372 1727204084.13338: Calling groups_inventory to load vars for managed-node3 12372 1727204084.13342: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204084.13354: Calling all_plugins_play to load vars for managed-node3 12372 1727204084.13358: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204084.13362: Calling groups_plugins_play to load vars for managed-node3 12372 1727204084.13581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204084.13879: done with get_vars() 12372 1727204084.13896: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.045) 0:00:11.125 ***** 12372 1727204084.14001: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 12372 1727204084.14255: worker is 1 (out of 1 available) 12372 1727204084.14270: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 12372 1727204084.14282: done queuing things up, now waiting for results queue to drain 12372 1727204084.14284: waiting for pending results... 
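The "Ensure initscripts network file dependency is present" task (main.yml:150) skipped just above is driven by the copy action plugin. A sketch of a task of that shape, assuming the classic /etc/sysconfig/network file is the target (the real path and content are not visible in this log):

# Sketch only; path and content are assumptions.
- name: Ensure initscripts network file dependency is present
  ansible.builtin.copy:
    content: "# Created by the network role (sketch)\n"
    dest: /etc/sysconfig/network
    force: false     # create only if the file does not already exist
    mode: "0644"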
12372 1727204084.14613: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12372 1727204084.14736: in run() - task 12b410aa-8751-244a-02f9-00000000012b 12372 1727204084.14759: variable 'ansible_search_path' from source: unknown 12372 1727204084.14795: variable 'ansible_search_path' from source: unknown 12372 1727204084.14829: calling self._execute() 12372 1727204084.14973: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204084.14991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204084.15059: variable 'omit' from source: magic vars 12372 1727204084.15532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204084.17495: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204084.17540: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204084.17594: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204084.17650: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204084.17681: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204084.17860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204084.17864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204084.17867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204084.17905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204084.17925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204084.18090: variable 'ansible_distribution' from source: facts 12372 1727204084.18097: variable 'ansible_distribution_major_version' from source: facts 12372 1727204084.18111: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204084.18114: when evaluation is False, skipping this task 12372 1727204084.18117: _execute() done 12372 1727204084.18125: dumping result to json 12372 1727204084.18129: done dumping result, returning 12372 1727204084.18139: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-244a-02f9-00000000012b] 12372 1727204084.18145: sending task result for task 12b410aa-8751-244a-02f9-00000000012b 12372 1727204084.18258: done sending task result for task 12b410aa-8751-244a-02f9-00000000012b 12372 1727204084.18261: WORKER PROCESS EXITING 
skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204084.18342: no more pending results, returning what we have 12372 1727204084.18346: results queue empty 12372 1727204084.18347: checking for any_errors_fatal 12372 1727204084.18355: done checking for any_errors_fatal 12372 1727204084.18356: checking for max_fail_percentage 12372 1727204084.18358: done checking for max_fail_percentage 12372 1727204084.18359: checking to see if all hosts have failed and the running result is not ok 12372 1727204084.18360: done checking to see if all hosts have failed 12372 1727204084.18361: getting the remaining hosts for this loop 12372 1727204084.18363: done getting the remaining hosts for this loop 12372 1727204084.18366: getting the next task for host managed-node3 12372 1727204084.18371: done getting next task for host managed-node3 12372 1727204084.18375: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12372 1727204084.18378: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204084.18397: getting variables 12372 1727204084.18399: in VariableManager get_vars() 12372 1727204084.18453: Calling all_inventory to load vars for managed-node3 12372 1727204084.18457: Calling groups_inventory to load vars for managed-node3 12372 1727204084.18461: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204084.18470: Calling all_plugins_play to load vars for managed-node3 12372 1727204084.18473: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204084.18477: Calling groups_plugins_play to load vars for managed-node3 12372 1727204084.18769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204084.19083: done with get_vars() 12372 1727204084.19109: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.052) 0:00:11.177 ***** 12372 1727204084.19203: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 12372 1727204084.19439: worker is 1 (out of 1 available) 12372 1727204084.19454: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 12372 1727204084.19468: done queuing things up, now waiting for results queue to drain 12372 1727204084.19469: waiting for pending results... 
12372 1727204084.19695: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 12372 1727204084.19831: in run() - task 12b410aa-8751-244a-02f9-00000000012c 12372 1727204084.19887: variable 'ansible_search_path' from source: unknown 12372 1727204084.19893: variable 'ansible_search_path' from source: unknown 12372 1727204084.19897: calling self._execute() 12372 1727204084.20024: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204084.20034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204084.20038: variable 'omit' from source: magic vars 12372 1727204084.20583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204084.22492: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204084.22550: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204084.22583: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204084.22617: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204084.22642: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204084.22713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204084.22739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204084.22763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204084.22800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204084.22813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204084.22951: variable 'ansible_distribution' from source: facts 12372 1727204084.22956: variable 'ansible_distribution_major_version' from source: facts 12372 1727204084.22959: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204084.22961: when evaluation is False, skipping this task 12372 1727204084.22964: _execute() done 12372 1727204084.23046: dumping result to json 12372 1727204084.23049: done dumping result, returning 12372 1727204084.23051: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-244a-02f9-00000000012c] 12372 1727204084.23053: sending task result for task 12b410aa-8751-244a-02f9-00000000012c 12372 1727204084.23121: done sending task result for task 12b410aa-8751-244a-02f9-00000000012c 12372 1727204084.23124: WORKER PROCESS EXITING skipping: [managed-node3] => { 
"changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204084.23219: no more pending results, returning what we have 12372 1727204084.23222: results queue empty 12372 1727204084.23223: checking for any_errors_fatal 12372 1727204084.23229: done checking for any_errors_fatal 12372 1727204084.23230: checking for max_fail_percentage 12372 1727204084.23232: done checking for max_fail_percentage 12372 1727204084.23237: checking to see if all hosts have failed and the running result is not ok 12372 1727204084.23238: done checking to see if all hosts have failed 12372 1727204084.23239: getting the remaining hosts for this loop 12372 1727204084.23240: done getting the remaining hosts for this loop 12372 1727204084.23244: getting the next task for host managed-node3 12372 1727204084.23249: done getting next task for host managed-node3 12372 1727204084.23253: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12372 1727204084.23256: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204084.23275: getting variables 12372 1727204084.23277: in VariableManager get_vars() 12372 1727204084.23331: Calling all_inventory to load vars for managed-node3 12372 1727204084.23334: Calling groups_inventory to load vars for managed-node3 12372 1727204084.23337: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204084.23347: Calling all_plugins_play to load vars for managed-node3 12372 1727204084.23351: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204084.23355: Calling groups_plugins_play to load vars for managed-node3 12372 1727204084.23621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204084.23950: done with get_vars() 12372 1727204084.23964: done getting variables 12372 1727204084.24138: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.049) 0:00:11.227 ***** 12372 1727204084.24197: entering _queue_task() for managed-node3/debug 12372 1727204084.24482: worker is 1 (out of 1 available) 12372 1727204084.24497: exiting _queue_task() for managed-node3/debug 12372 1727204084.24508: done queuing things up, now waiting for results queue to drain 12372 1727204084.24510: waiting for pending results... 
12372 1727204084.24704: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12372 1727204084.24801: in run() - task 12b410aa-8751-244a-02f9-00000000012d 12372 1727204084.24814: variable 'ansible_search_path' from source: unknown 12372 1727204084.24817: variable 'ansible_search_path' from source: unknown 12372 1727204084.24854: calling self._execute() 12372 1727204084.24928: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204084.24934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204084.24945: variable 'omit' from source: magic vars 12372 1727204084.25364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204084.27092: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204084.27158: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204084.27190: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204084.27223: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204084.27249: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204084.27316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204084.27344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204084.27368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204084.27402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204084.27415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204084.27527: variable 'ansible_distribution' from source: facts 12372 1727204084.27532: variable 'ansible_distribution_major_version' from source: facts 12372 1727204084.27542: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204084.27545: when evaluation is False, skipping this task 12372 1727204084.27549: _execute() done 12372 1727204084.27554: dumping result to json 12372 1727204084.27559: done dumping result, returning 12372 1727204084.27567: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-244a-02f9-00000000012d] 12372 1727204084.27578: sending task result for task 12b410aa-8751-244a-02f9-00000000012d 12372 1727204084.27665: done sending task result for task 12b410aa-8751-244a-02f9-00000000012d 12372 1727204084.27668: WORKER 
PROCESS EXITING skipping: [managed-node3] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12372 1727204084.27721: no more pending results, returning what we have 12372 1727204084.27725: results queue empty 12372 1727204084.27726: checking for any_errors_fatal 12372 1727204084.27733: done checking for any_errors_fatal 12372 1727204084.27734: checking for max_fail_percentage 12372 1727204084.27735: done checking for max_fail_percentage 12372 1727204084.27737: checking to see if all hosts have failed and the running result is not ok 12372 1727204084.27738: done checking to see if all hosts have failed 12372 1727204084.27738: getting the remaining hosts for this loop 12372 1727204084.27740: done getting the remaining hosts for this loop 12372 1727204084.27744: getting the next task for host managed-node3 12372 1727204084.27750: done getting next task for host managed-node3 12372 1727204084.27754: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12372 1727204084.27757: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204084.27775: getting variables 12372 1727204084.27776: in VariableManager get_vars() 12372 1727204084.27826: Calling all_inventory to load vars for managed-node3 12372 1727204084.27829: Calling groups_inventory to load vars for managed-node3 12372 1727204084.27832: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204084.27841: Calling all_plugins_play to load vars for managed-node3 12372 1727204084.27844: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204084.27848: Calling groups_plugins_play to load vars for managed-node3 12372 1727204084.28033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204084.28200: done with get_vars() 12372 1727204084.28209: done getting variables 12372 1727204084.28255: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.040) 0:00:11.268 ***** 12372 1727204084.28280: entering _queue_task() for managed-node3/debug 12372 1727204084.28466: worker is 1 (out of 1 available) 12372 1727204084.28478: exiting _queue_task() for managed-node3/debug 12372 1727204084.28524: done queuing things up, now waiting for results queue to drain 12372 1727204084.28526: waiting for pending results... 
12372 1727204084.28734: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12372 1727204084.28829: in run() - task 12b410aa-8751-244a-02f9-00000000012e 12372 1727204084.28842: variable 'ansible_search_path' from source: unknown 12372 1727204084.28846: variable 'ansible_search_path' from source: unknown 12372 1727204084.28881: calling self._execute() 12372 1727204084.28953: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204084.28963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204084.28976: variable 'omit' from source: magic vars 12372 1727204084.29333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204084.31066: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204084.31123: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204084.31157: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204084.31186: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204084.31211: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204084.31282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204084.31308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204084.31331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204084.31367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204084.31380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204084.31488: variable 'ansible_distribution' from source: facts 12372 1727204084.31496: variable 'ansible_distribution_major_version' from source: facts 12372 1727204084.31507: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204084.31510: when evaluation is False, skipping this task 12372 1727204084.31512: _execute() done 12372 1727204084.31520: dumping result to json 12372 1727204084.31522: done dumping result, returning 12372 1727204084.31531: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-244a-02f9-00000000012e] 12372 1727204084.31536: sending task result for task 12b410aa-8751-244a-02f9-00000000012e 12372 1727204084.31628: done sending task result for task 12b410aa-8751-244a-02f9-00000000012e 12372 1727204084.31631: WORKER 
PROCESS EXITING skipping: [managed-node3] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12372 1727204084.31681: no more pending results, returning what we have 12372 1727204084.31685: results queue empty 12372 1727204084.31686: checking for any_errors_fatal 12372 1727204084.31694: done checking for any_errors_fatal 12372 1727204084.31695: checking for max_fail_percentage 12372 1727204084.31697: done checking for max_fail_percentage 12372 1727204084.31698: checking to see if all hosts have failed and the running result is not ok 12372 1727204084.31699: done checking to see if all hosts have failed 12372 1727204084.31700: getting the remaining hosts for this loop 12372 1727204084.31701: done getting the remaining hosts for this loop 12372 1727204084.31705: getting the next task for host managed-node3 12372 1727204084.31711: done getting next task for host managed-node3 12372 1727204084.31718: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12372 1727204084.31721: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204084.31739: getting variables 12372 1727204084.31742: in VariableManager get_vars() 12372 1727204084.31798: Calling all_inventory to load vars for managed-node3 12372 1727204084.31801: Calling groups_inventory to load vars for managed-node3 12372 1727204084.31804: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204084.31813: Calling all_plugins_play to load vars for managed-node3 12372 1727204084.31819: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204084.31822: Calling groups_plugins_play to load vars for managed-node3 12372 1727204084.31964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204084.32160: done with get_vars() 12372 1727204084.32169: done getting variables 12372 1727204084.32215: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.039) 0:00:11.308 ***** 12372 1727204084.32244: entering _queue_task() for managed-node3/debug 12372 1727204084.32443: worker is 1 (out of 1 available) 12372 1727204084.32458: exiting _queue_task() for managed-node3/debug 12372 1727204084.32471: done queuing things up, now waiting for results queue to drain 12372 1727204084.32473: waiting for pending results... 
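Both "Show ... messages" tasks above are routed to the `debug` action plugin (the log loads `ansible/plugins/action/debug.py` for them). A minimal sketch of such a task, with the variable name inferred from the task title (an assumption) and the same guard the log keeps evaluating:

```yaml
# Sketch only: the variable name is inferred from the task title, not from
# the role's source.
- name: Show debug messages for the network_state
  ansible.builtin.debug:
    var: network_state
  when: >-
    ansible_distribution in ['CentOS','RedHat']
    and ansible_distribution_major_version | int < 9
```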
12372 1727204084.32660: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12372 1727204084.32763: in run() - task 12b410aa-8751-244a-02f9-00000000012f 12372 1727204084.32775: variable 'ansible_search_path' from source: unknown 12372 1727204084.32778: variable 'ansible_search_path' from source: unknown 12372 1727204084.32821: calling self._execute() 12372 1727204084.32888: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204084.32897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204084.32907: variable 'omit' from source: magic vars 12372 1727204084.33796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204084.36599: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204084.36709: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204084.36761: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204084.36816: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204084.36863: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204084.36977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204084.37025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204084.37196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204084.37200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204084.37202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204084.37331: variable 'ansible_distribution' from source: facts 12372 1727204084.37337: variable 'ansible_distribution_major_version' from source: facts 12372 1727204084.37348: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204084.37352: when evaluation is False, skipping this task 12372 1727204084.37354: _execute() done 12372 1727204084.37359: dumping result to json 12372 1727204084.37364: done dumping result, returning 12372 1727204084.37373: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-244a-02f9-00000000012f] 12372 1727204084.37378: sending task result for task 12b410aa-8751-244a-02f9-00000000012f 12372 1727204084.37472: done sending task result for task 12b410aa-8751-244a-02f9-00000000012f 12372 1727204084.37476: WORKER PROCESS EXITING 
skipping: [managed-node3] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12372 1727204084.37529: no more pending results, returning what we have 12372 1727204084.37537: results queue empty 12372 1727204084.37539: checking for any_errors_fatal 12372 1727204084.37546: done checking for any_errors_fatal 12372 1727204084.37547: checking for max_fail_percentage 12372 1727204084.37550: done checking for max_fail_percentage 12372 1727204084.37551: checking to see if all hosts have failed and the running result is not ok 12372 1727204084.37552: done checking to see if all hosts have failed 12372 1727204084.37553: getting the remaining hosts for this loop 12372 1727204084.37554: done getting the remaining hosts for this loop 12372 1727204084.37559: getting the next task for host managed-node3 12372 1727204084.37566: done getting next task for host managed-node3 12372 1727204084.37570: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12372 1727204084.37574: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204084.37594: getting variables 12372 1727204084.37596: in VariableManager get_vars() 12372 1727204084.37651: Calling all_inventory to load vars for managed-node3 12372 1727204084.37655: Calling groups_inventory to load vars for managed-node3 12372 1727204084.37657: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204084.37666: Calling all_plugins_play to load vars for managed-node3 12372 1727204084.37669: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204084.37672: Calling groups_plugins_play to load vars for managed-node3 12372 1727204084.37825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204084.37998: done with get_vars() 12372 1727204084.38007: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.058) 0:00:11.366 ***** 12372 1727204084.38082: entering _queue_task() for managed-node3/ping 12372 1727204084.38282: worker is 1 (out of 1 available) 12372 1727204084.38299: exiting _queue_task() for managed-node3/ping 12372 1727204084.38310: done queuing things up, now waiting for results queue to drain 12372 1727204084.38312: waiting for pending results... 
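The "Re-test connectivity" task queued above is dispatched to the `ping` action plugin (`entering _queue_task() for managed-node3/ping`). A minimal sketch of a task that would be routed this way, under the same guard evaluated in this log, might be:

```yaml
# Sketch: a connectivity re-test handled by the ping action; the when guard
# matches the conditional this log evaluates to False.
- name: Re-test connectivity
  ansible.builtin.ping:
  when: >-
    ansible_distribution in ['CentOS','RedHat']
    and ansible_distribution_major_version | int < 9
```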
12372 1727204084.38488: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 12372 1727204084.38587: in run() - task 12b410aa-8751-244a-02f9-000000000130 12372 1727204084.38601: variable 'ansible_search_path' from source: unknown 12372 1727204084.38605: variable 'ansible_search_path' from source: unknown 12372 1727204084.38640: calling self._execute() 12372 1727204084.38708: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204084.38715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204084.38727: variable 'omit' from source: magic vars 12372 1727204084.39092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204084.40836: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204084.40885: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204084.40917: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204084.40953: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204084.40975: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204084.41043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204084.41071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204084.41093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204084.41128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204084.41141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204084.41253: variable 'ansible_distribution' from source: facts 12372 1727204084.41258: variable 'ansible_distribution_major_version' from source: facts 12372 1727204084.41271: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204084.41276: when evaluation is False, skipping this task 12372 1727204084.41279: _execute() done 12372 1727204084.41282: dumping result to json 12372 1727204084.41284: done dumping result, returning 12372 1727204084.41383: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-244a-02f9-000000000130] 12372 1727204084.41386: sending task result for task 12b410aa-8751-244a-02f9-000000000130 12372 1727204084.41452: done sending task result for task 12b410aa-8751-244a-02f9-000000000130 12372 1727204084.41454: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": 
false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204084.41522: no more pending results, returning what we have 12372 1727204084.41525: results queue empty 12372 1727204084.41526: checking for any_errors_fatal 12372 1727204084.41530: done checking for any_errors_fatal 12372 1727204084.41530: checking for max_fail_percentage 12372 1727204084.41531: done checking for max_fail_percentage 12372 1727204084.41532: checking to see if all hosts have failed and the running result is not ok 12372 1727204084.41533: done checking to see if all hosts have failed 12372 1727204084.41534: getting the remaining hosts for this loop 12372 1727204084.41534: done getting the remaining hosts for this loop 12372 1727204084.41537: getting the next task for host managed-node3 12372 1727204084.41543: done getting next task for host managed-node3 12372 1727204084.41545: ^ task is: TASK: meta (role_complete) 12372 1727204084.41548: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204084.41562: getting variables 12372 1727204084.41564: in VariableManager get_vars() 12372 1727204084.41603: Calling all_inventory to load vars for managed-node3 12372 1727204084.41605: Calling groups_inventory to load vars for managed-node3 12372 1727204084.41607: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204084.41614: Calling all_plugins_play to load vars for managed-node3 12372 1727204084.41616: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204084.41619: Calling groups_plugins_play to load vars for managed-node3 12372 1727204084.41794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204084.41961: done with get_vars() 12372 1727204084.41969: done getting variables 12372 1727204084.42033: done queuing things up, now waiting for results queue to drain 12372 1727204084.42035: results queue empty 12372 1727204084.42035: checking for any_errors_fatal 12372 1727204084.42037: done checking for any_errors_fatal 12372 1727204084.42038: checking for max_fail_percentage 12372 1727204084.42038: done checking for max_fail_percentage 12372 1727204084.42039: checking to see if all hosts have failed and the running result is not ok 12372 1727204084.42040: done checking to see if all hosts have failed 12372 1727204084.42040: getting the remaining hosts for this loop 12372 1727204084.42041: done getting the remaining hosts for this loop 12372 1727204084.42043: getting the next task for host managed-node3 12372 1727204084.42045: done getting next task for host managed-node3 12372 1727204084.42047: ^ task is: TASK: From the active connection, get the controller profile "{{ controller_profile }}" 12372 1727204084.42048: ^ state is: HOST STATE: block=2, task=28, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204084.42050: getting variables 12372 1727204084.42050: in VariableManager get_vars() 12372 1727204084.42066: Calling all_inventory to load vars for managed-node3 12372 1727204084.42068: Calling groups_inventory to load vars for managed-node3 12372 1727204084.42069: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204084.42073: Calling all_plugins_play to load vars for managed-node3 12372 1727204084.42074: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204084.42077: Calling groups_plugins_play to load vars for managed-node3 12372 1727204084.42191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204084.42342: done with get_vars() 12372 1727204084.42350: done getting variables 12372 1727204084.42381: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12372 1727204084.42478: variable 'controller_profile' from source: play vars TASK [From the active connection, get the controller profile "bond0"] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:200 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.044) 0:00:11.410 ***** 12372 1727204084.42502: entering _queue_task() for managed-node3/command 12372 1727204084.42696: worker is 1 (out of 1 available) 12372 1727204084.42711: exiting _queue_task() for managed-node3/command 12372 1727204084.42724: done queuing things up, now waiting for results queue to drain 12372 1727204084.42726: waiting for pending results... 
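The test-playbook task queued above resolves the play variable `controller_profile` to `bond0` and is handled by the `command` action. The playbook's actual command line never appears in this log because the task is skipped; the `nmcli` invocation and the register name below are assumptions used only to illustrate the shape of such a task.

```yaml
# Sketch only: the real command in tests_bond_removal.yml is not shown in this
# log. The nmcli arguments and the register name are assumptions.
- name: From the active connection, get the controller profile "{{ controller_profile }}"
  ansible.builtin.command: nmcli -g GENERAL.CONNECTION device show {{ controller_profile }}
  register: controller_profile_nmcli   # hypothetical name
  changed_when: false
```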
12372 1727204084.42897: running TaskExecutor() for managed-node3/TASK: From the active connection, get the controller profile "bond0" 12372 1727204084.42967: in run() - task 12b410aa-8751-244a-02f9-000000000160 12372 1727204084.42979: variable 'ansible_search_path' from source: unknown 12372 1727204084.43013: calling self._execute() 12372 1727204084.43083: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204084.43093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204084.43102: variable 'omit' from source: magic vars 12372 1727204084.43457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204084.45202: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204084.45257: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204084.45287: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204084.45318: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204084.45344: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204084.45411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204084.45438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204084.45460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204084.45497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204084.45511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204084.45618: variable 'ansible_distribution' from source: facts 12372 1727204084.45626: variable 'ansible_distribution_major_version' from source: facts 12372 1727204084.45636: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204084.45639: when evaluation is False, skipping this task 12372 1727204084.45644: _execute() done 12372 1727204084.45647: dumping result to json 12372 1727204084.45652: done dumping result, returning 12372 1727204084.45659: done running TaskExecutor() for managed-node3/TASK: From the active connection, get the controller profile "bond0" [12b410aa-8751-244a-02f9-000000000160] 12372 1727204084.45665: sending task result for task 12b410aa-8751-244a-02f9-000000000160 12372 1727204084.45754: done sending task result for task 12b410aa-8751-244a-02f9-000000000160 12372 1727204084.45757: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204084.45832: no more pending results, returning what we have 12372 1727204084.45836: results queue empty 12372 1727204084.45837: checking for any_errors_fatal 12372 1727204084.45839: done checking for any_errors_fatal 12372 1727204084.45840: checking for max_fail_percentage 12372 1727204084.45841: done checking for max_fail_percentage 12372 1727204084.45842: checking to see if all hosts have failed and the running result is not ok 12372 1727204084.45844: done checking to see if all hosts have failed 12372 1727204084.45844: getting the remaining hosts for this loop 12372 1727204084.45846: done getting the remaining hosts for this loop 12372 1727204084.45849: getting the next task for host managed-node3 12372 1727204084.45854: done getting next task for host managed-node3 12372 1727204084.45857: ^ task is: TASK: Assert that the controller profile is activated 12372 1727204084.45859: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204084.45862: getting variables 12372 1727204084.45864: in VariableManager get_vars() 12372 1727204084.45913: Calling all_inventory to load vars for managed-node3 12372 1727204084.45917: Calling groups_inventory to load vars for managed-node3 12372 1727204084.45920: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204084.45927: Calling all_plugins_play to load vars for managed-node3 12372 1727204084.45929: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204084.45931: Calling groups_plugins_play to load vars for managed-node3 12372 1727204084.46103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204084.46263: done with get_vars() 12372 1727204084.46271: done getting variables 12372 1727204084.46316: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the controller profile is activated] ************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:207 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.038) 0:00:11.449 ***** 12372 1727204084.46338: entering _queue_task() for managed-node3/assert 12372 1727204084.46525: worker is 1 (out of 1 available) 12372 1727204084.46541: exiting _queue_task() for managed-node3/assert 12372 1727204084.46554: done queuing things up, now waiting for results queue to drain 12372 1727204084.46556: waiting for pending results... 
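The assertion task queued above is handled by the `assert` action plugin. Because the task is skipped, the asserted expression is not visible anywhere in this log; the condition below refers to the hypothetical register from the command sketch earlier and is purely illustrative.

```yaml
# Sketch only: the asserted expression is not recorded in this log; the
# registered variable name comes from the hypothetical command sketch above.
- name: Assert that the controller profile is activated
  ansible.builtin.assert:
    that:
      - controller_profile_nmcli.stdout | length > 0
    fail_msg: "Controller profile {{ controller_profile }} does not appear to be active"
```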
12372 1727204084.46737: running TaskExecutor() for managed-node3/TASK: Assert that the controller profile is activated 12372 1727204084.46810: in run() - task 12b410aa-8751-244a-02f9-000000000161 12372 1727204084.46824: variable 'ansible_search_path' from source: unknown 12372 1727204084.46856: calling self._execute() 12372 1727204084.46936: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204084.46942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204084.46952: variable 'omit' from source: magic vars 12372 1727204084.47309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204084.49022: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204084.49078: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204084.49113: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204084.49146: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204084.49170: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204084.49239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204084.49262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204084.49284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204084.49325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204084.49339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204084.49447: variable 'ansible_distribution' from source: facts 12372 1727204084.49453: variable 'ansible_distribution_major_version' from source: facts 12372 1727204084.49463: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204084.49467: when evaluation is False, skipping this task 12372 1727204084.49470: _execute() done 12372 1727204084.49473: dumping result to json 12372 1727204084.49479: done dumping result, returning 12372 1727204084.49486: done running TaskExecutor() for managed-node3/TASK: Assert that the controller profile is activated [12b410aa-8751-244a-02f9-000000000161] 12372 1727204084.49493: sending task result for task 12b410aa-8751-244a-02f9-000000000161 12372 1727204084.49586: done sending task result for task 12b410aa-8751-244a-02f9-000000000161 12372 1727204084.49597: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204084.49657: no more pending results, returning what we have 12372 1727204084.49661: results queue empty 12372 1727204084.49662: checking for any_errors_fatal 12372 1727204084.49667: done checking for any_errors_fatal 12372 1727204084.49668: checking for max_fail_percentage 12372 1727204084.49670: done checking for max_fail_percentage 12372 1727204084.49671: checking to see if all hosts have failed and the running result is not ok 12372 1727204084.49672: done checking to see if all hosts have failed 12372 1727204084.49673: getting the remaining hosts for this loop 12372 1727204084.49674: done getting the remaining hosts for this loop 12372 1727204084.49677: getting the next task for host managed-node3 12372 1727204084.49682: done getting next task for host managed-node3 12372 1727204084.49685: ^ task is: TASK: Get the controller device details 12372 1727204084.49688: ^ state is: HOST STATE: block=2, task=30, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204084.49693: getting variables 12372 1727204084.49694: in VariableManager get_vars() 12372 1727204084.49749: Calling all_inventory to load vars for managed-node3 12372 1727204084.49751: Calling groups_inventory to load vars for managed-node3 12372 1727204084.49753: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204084.49760: Calling all_plugins_play to load vars for managed-node3 12372 1727204084.49762: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204084.49764: Calling groups_plugins_play to load vars for managed-node3 12372 1727204084.49906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204084.50071: done with get_vars() 12372 1727204084.50079: done getting variables 12372 1727204084.50126: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the controller device details] *************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:214 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.038) 0:00:11.487 ***** 12372 1727204084.50149: entering _queue_task() for managed-node3/command 12372 1727204084.50339: worker is 1 (out of 1 available) 12372 1727204084.50355: exiting _queue_task() for managed-node3/command 12372 1727204084.50369: done queuing things up, now waiting for results queue to drain 12372 1727204084.50370: waiting for pending results... 
12372 1727204084.50547: running TaskExecutor() for managed-node3/TASK: Get the controller device details 12372 1727204084.50615: in run() - task 12b410aa-8751-244a-02f9-000000000162 12372 1727204084.50630: variable 'ansible_search_path' from source: unknown 12372 1727204084.50660: calling self._execute() 12372 1727204084.50729: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204084.50736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204084.50746: variable 'omit' from source: magic vars 12372 1727204084.51095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204084.52823: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204084.52873: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204084.52909: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204084.52940: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204084.52963: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204084.53034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204084.53058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204084.53079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204084.53119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204084.53132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204084.53247: variable 'ansible_distribution' from source: facts 12372 1727204084.53252: variable 'ansible_distribution_major_version' from source: facts 12372 1727204084.53263: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204084.53266: when evaluation is False, skipping this task 12372 1727204084.53270: _execute() done 12372 1727204084.53274: dumping result to json 12372 1727204084.53279: done dumping result, returning 12372 1727204084.53286: done running TaskExecutor() for managed-node3/TASK: Get the controller device details [12b410aa-8751-244a-02f9-000000000162] 12372 1727204084.53294: sending task result for task 12b410aa-8751-244a-02f9-000000000162 12372 1727204084.53387: done sending task result for task 12b410aa-8751-244a-02f9-000000000162 12372 1727204084.53398: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 
9)", "skip_reason": "Conditional result was False" } 12372 1727204084.53470: no more pending results, returning what we have 12372 1727204084.53473: results queue empty 12372 1727204084.53475: checking for any_errors_fatal 12372 1727204084.53483: done checking for any_errors_fatal 12372 1727204084.53484: checking for max_fail_percentage 12372 1727204084.53485: done checking for max_fail_percentage 12372 1727204084.53486: checking to see if all hosts have failed and the running result is not ok 12372 1727204084.53487: done checking to see if all hosts have failed 12372 1727204084.53488: getting the remaining hosts for this loop 12372 1727204084.53492: done getting the remaining hosts for this loop 12372 1727204084.53495: getting the next task for host managed-node3 12372 1727204084.53507: done getting next task for host managed-node3 12372 1727204084.53510: ^ task is: TASK: Assert that the controller profile is activated 12372 1727204084.53513: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204084.53519: getting variables 12372 1727204084.53520: in VariableManager get_vars() 12372 1727204084.53570: Calling all_inventory to load vars for managed-node3 12372 1727204084.53574: Calling groups_inventory to load vars for managed-node3 12372 1727204084.53577: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204084.53586: Calling all_plugins_play to load vars for managed-node3 12372 1727204084.53591: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204084.53595: Calling groups_plugins_play to load vars for managed-node3 12372 1727204084.53778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204084.53944: done with get_vars() 12372 1727204084.53955: done getting variables 12372 1727204084.54002: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the controller profile is activated] ************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:221 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.038) 0:00:11.525 ***** 12372 1727204084.54026: entering _queue_task() for managed-node3/assert 12372 1727204084.54243: worker is 1 (out of 1 available) 12372 1727204084.54255: exiting _queue_task() for managed-node3/assert 12372 1727204084.54269: done queuing things up, now waiting for results queue to drain 12372 1727204084.54271: waiting for pending results... 
12372 1727204084.54458: running TaskExecutor() for managed-node3/TASK: Assert that the controller profile is activated 12372 1727204084.54535: in run() - task 12b410aa-8751-244a-02f9-000000000163 12372 1727204084.54548: variable 'ansible_search_path' from source: unknown 12372 1727204084.54578: calling self._execute() 12372 1727204084.54649: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204084.54656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204084.54665: variable 'omit' from source: magic vars 12372 1727204084.55025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204084.56785: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204084.56994: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204084.57030: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204084.57059: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204084.57081: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204084.57154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204084.57179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204084.57202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204084.57242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204084.57254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204084.57362: variable 'ansible_distribution' from source: facts 12372 1727204084.57366: variable 'ansible_distribution_major_version' from source: facts 12372 1727204084.57378: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204084.57381: when evaluation is False, skipping this task 12372 1727204084.57383: _execute() done 12372 1727204084.57387: dumping result to json 12372 1727204084.57393: done dumping result, returning 12372 1727204084.57401: done running TaskExecutor() for managed-node3/TASK: Assert that the controller profile is activated [12b410aa-8751-244a-02f9-000000000163] 12372 1727204084.57406: sending task result for task 12b410aa-8751-244a-02f9-000000000163 12372 1727204084.57498: done sending task result for task 12b410aa-8751-244a-02f9-000000000163 12372 1727204084.57501: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204084.57552: no more pending results, returning what we have 12372 1727204084.57556: results queue empty 12372 1727204084.57557: checking for any_errors_fatal 12372 1727204084.57562: done checking for any_errors_fatal 12372 1727204084.57563: checking for max_fail_percentage 12372 1727204084.57564: done checking for max_fail_percentage 12372 1727204084.57565: checking to see if all hosts have failed and the running result is not ok 12372 1727204084.57566: done checking to see if all hosts have failed 12372 1727204084.57567: getting the remaining hosts for this loop 12372 1727204084.57569: done getting the remaining hosts for this loop 12372 1727204084.57573: getting the next task for host managed-node3 12372 1727204084.57585: done getting next task for host managed-node3 12372 1727204084.57591: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12372 1727204084.57596: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12372 1727204084.57614: getting variables 12372 1727204084.57618: in VariableManager get_vars() 12372 1727204084.57699: Calling all_inventory to load vars for managed-node3 12372 1727204084.57703: Calling groups_inventory to load vars for managed-node3 12372 1727204084.57707: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204084.57719: Calling all_plugins_play to load vars for managed-node3 12372 1727204084.57722: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204084.57726: Calling groups_plugins_play to load vars for managed-node3 12372 1727204084.57874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204084.58054: done with get_vars() 12372 1727204084.58063: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.041) 0:00:11.567 ***** 12372 1727204084.58145: entering _queue_task() for managed-node3/include_tasks 12372 1727204084.58347: worker is 1 (out of 1 available) 12372 1727204084.58364: exiting _queue_task() for managed-node3/include_tasks 12372 1727204084.58376: done queuing things up, now waiting for results queue to drain 12372 1727204084.58378: waiting for pending results... 
12372 1727204084.58550: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12372 1727204084.58655: in run() - task 12b410aa-8751-244a-02f9-00000000016c 12372 1727204084.58669: variable 'ansible_search_path' from source: unknown 12372 1727204084.58673: variable 'ansible_search_path' from source: unknown 12372 1727204084.58705: calling self._execute() 12372 1727204084.58776: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204084.58783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204084.58828: variable 'omit' from source: magic vars 12372 1727204084.59200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204084.61630: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204084.61682: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204084.61713: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204084.61748: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204084.61771: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204084.61841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204084.61867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204084.61890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204084.61924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204084.61939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204084.62050: variable 'ansible_distribution' from source: facts 12372 1727204084.62054: variable 'ansible_distribution_major_version' from source: facts 12372 1727204084.62067: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204084.62071: when evaluation is False, skipping this task 12372 1727204084.62074: _execute() done 12372 1727204084.62077: dumping result to json 12372 1727204084.62083: done dumping result, returning 12372 1727204084.62092: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-244a-02f9-00000000016c] 12372 1727204084.62098: sending task result for task 12b410aa-8751-244a-02f9-00000000016c 12372 1727204084.62194: done sending task result for task 12b410aa-8751-244a-02f9-00000000016c 12372 1727204084.62199: WORKER PROCESS EXITING skipping: 
[managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204084.62252: no more pending results, returning what we have 12372 1727204084.62256: results queue empty 12372 1727204084.62258: checking for any_errors_fatal 12372 1727204084.62266: done checking for any_errors_fatal 12372 1727204084.62266: checking for max_fail_percentage 12372 1727204084.62268: done checking for max_fail_percentage 12372 1727204084.62269: checking to see if all hosts have failed and the running result is not ok 12372 1727204084.62270: done checking to see if all hosts have failed 12372 1727204084.62271: getting the remaining hosts for this loop 12372 1727204084.62273: done getting the remaining hosts for this loop 12372 1727204084.62277: getting the next task for host managed-node3 12372 1727204084.62285: done getting next task for host managed-node3 12372 1727204084.62291: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12372 1727204084.62296: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12372 1727204084.62318: getting variables 12372 1727204084.62319: in VariableManager get_vars() 12372 1727204084.62371: Calling all_inventory to load vars for managed-node3 12372 1727204084.62374: Calling groups_inventory to load vars for managed-node3 12372 1727204084.62377: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204084.62386: Calling all_plugins_play to load vars for managed-node3 12372 1727204084.62395: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204084.62399: Calling groups_plugins_play to load vars for managed-node3 12372 1727204084.62573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204084.62742: done with get_vars() 12372 1727204084.62752: done getting variables 12372 1727204084.62799: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.046) 0:00:11.613 ***** 12372 1727204084.62826: entering _queue_task() for managed-node3/debug 12372 1727204084.63019: worker is 1 (out of 1 available) 12372 1727204084.63034: exiting _queue_task() for managed-node3/debug 12372 1727204084.63047: done queuing things up, now waiting for results queue to drain 12372 1727204084.63049: waiting for pending results... 12372 1727204084.63242: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 12372 1727204084.63380: in run() - task 12b410aa-8751-244a-02f9-00000000016d 12372 1727204084.63595: variable 'ansible_search_path' from source: unknown 12372 1727204084.63598: variable 'ansible_search_path' from source: unknown 12372 1727204084.63601: calling self._execute() 12372 1727204084.63605: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204084.63608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204084.63611: variable 'omit' from source: magic vars 12372 1727204084.64084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204084.66927: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204084.67004: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204084.67052: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204084.67099: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204084.67140: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204084.67239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204084.67280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204084.67331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204084.67395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204084.67420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204084.67580: variable 'ansible_distribution' from source: facts 12372 1727204084.67596: variable 'ansible_distribution_major_version' from source: facts 12372 1727204084.67612: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204084.67623: when evaluation is False, skipping this task 12372 1727204084.67630: _execute() done 12372 1727204084.67638: dumping result to json 12372 1727204084.67646: done dumping result, returning 12372 1727204084.67657: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-244a-02f9-00000000016d] 12372 1727204084.67667: sending task result for task 12b410aa-8751-244a-02f9-00000000016d skipping: [managed-node3] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12372 1727204084.67841: no more pending results, returning what we have 12372 1727204084.67845: results queue empty 12372 1727204084.67846: checking for any_errors_fatal 12372 1727204084.67853: done checking for any_errors_fatal 12372 1727204084.67853: checking for max_fail_percentage 12372 1727204084.67855: done checking for max_fail_percentage 12372 1727204084.67856: checking to see if all hosts have failed and the running result is not ok 12372 1727204084.67857: done checking to see if all hosts have failed 12372 1727204084.67858: getting the remaining hosts for this loop 12372 1727204084.67860: done getting the remaining hosts for this loop 12372 1727204084.67864: getting the next task for host managed-node3 12372 1727204084.67871: done getting next task for host managed-node3 12372 1727204084.67877: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12372 1727204084.67881: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 12372 1727204084.68070: getting variables 12372 1727204084.68073: in VariableManager get_vars() 12372 1727204084.68129: Calling all_inventory to load vars for managed-node3 12372 1727204084.68132: Calling groups_inventory to load vars for managed-node3 12372 1727204084.68135: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204084.68142: done sending task result for task 12b410aa-8751-244a-02f9-00000000016d 12372 1727204084.68145: WORKER PROCESS EXITING 12372 1727204084.68154: Calling all_plugins_play to load vars for managed-node3 12372 1727204084.68157: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204084.68161: Calling groups_plugins_play to load vars for managed-node3 12372 1727204084.68394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204084.68705: done with get_vars() 12372 1727204084.68721: done getting variables 12372 1727204084.68785: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.060) 0:00:11.673 ***** 12372 1727204084.68831: entering _queue_task() for managed-node3/fail 12372 1727204084.69324: worker is 1 (out of 1 available) 12372 1727204084.69333: exiting _queue_task() for managed-node3/fail 12372 1727204084.69345: done queuing things up, now waiting for results queue to drain 12372 1727204084.69347: waiting for pending results... 
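[Editor's note] From this point the run has crossed from the test playbook into the role itself: the task banners now reference roles/network/tasks/main.yml at lines 4, 7 and 11, and each of those role tasks is skipped with the same distro guard, which suggests the guard is inherited from the calling block in the test play rather than written into main.yml. Based only on the task names and action plugins visible in the log (include_tasks, debug, fail), a sketch of the top of that tasks file might look like the following; the included file name, variable names, messages and fail conditions are assumptions, not quotes from the role.

# roles/network/tasks/main.yml -- reconstructed shape only, not the actual file contents
- name: Ensure ansible_facts used by role
  ansible.builtin.include_tasks: set_facts.yml          # assumed file name

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"   # assumed variable and wording

- name: >-
    Abort applying the network state configuration if using the
    `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider   # assumed message
  when:                                           # assumed conditions
    - network_state is defined
    - network_provider == "initscripts"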
12372 1727204084.69421: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12372 1727204084.69606: in run() - task 12b410aa-8751-244a-02f9-00000000016e 12372 1727204084.69628: variable 'ansible_search_path' from source: unknown 12372 1727204084.69637: variable 'ansible_search_path' from source: unknown 12372 1727204084.69685: calling self._execute() 12372 1727204084.69778: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204084.69798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204084.69814: variable 'omit' from source: magic vars 12372 1727204084.70422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204084.73562: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204084.73648: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204084.73692: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204084.73731: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204084.73770: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204084.73882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204084.73921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204084.73953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204084.74015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204084.74095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204084.74197: variable 'ansible_distribution' from source: facts 12372 1727204084.74204: variable 'ansible_distribution_major_version' from source: facts 12372 1727204084.74218: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204084.74222: when evaluation is False, skipping this task 12372 1727204084.74225: _execute() done 12372 1727204084.74227: dumping result to json 12372 1727204084.74230: done dumping result, returning 12372 1727204084.74239: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-244a-02f9-00000000016e] 12372 1727204084.74245: sending task result for task 
12b410aa-8751-244a-02f9-00000000016e 12372 1727204084.74579: done sending task result for task 12b410aa-8751-244a-02f9-00000000016e 12372 1727204084.74584: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204084.74632: no more pending results, returning what we have 12372 1727204084.74636: results queue empty 12372 1727204084.74637: checking for any_errors_fatal 12372 1727204084.74644: done checking for any_errors_fatal 12372 1727204084.74645: checking for max_fail_percentage 12372 1727204084.74647: done checking for max_fail_percentage 12372 1727204084.74648: checking to see if all hosts have failed and the running result is not ok 12372 1727204084.74650: done checking to see if all hosts have failed 12372 1727204084.74651: getting the remaining hosts for this loop 12372 1727204084.74652: done getting the remaining hosts for this loop 12372 1727204084.74656: getting the next task for host managed-node3 12372 1727204084.74662: done getting next task for host managed-node3 12372 1727204084.74667: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12372 1727204084.74671: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12372 1727204084.74691: getting variables 12372 1727204084.74693: in VariableManager get_vars() 12372 1727204084.74749: Calling all_inventory to load vars for managed-node3 12372 1727204084.74752: Calling groups_inventory to load vars for managed-node3 12372 1727204084.74755: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204084.74765: Calling all_plugins_play to load vars for managed-node3 12372 1727204084.74768: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204084.74772: Calling groups_plugins_play to load vars for managed-node3 12372 1727204084.75046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204084.75348: done with get_vars() 12372 1727204084.75360: done getting variables 12372 1727204084.75435: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.066) 0:00:11.740 ***** 12372 1727204084.75477: entering _queue_task() for managed-node3/fail 12372 1727204084.75754: worker is 1 (out of 1 available) 12372 1727204084.75769: exiting _queue_task() for managed-node3/fail 12372 1727204084.75784: done queuing things up, now waiting for results queue to drain 12372 1727204084.75786: waiting for pending results... 
12372 1727204084.76206: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12372 1727204084.76285: in run() - task 12b410aa-8751-244a-02f9-00000000016f 12372 1727204084.76309: variable 'ansible_search_path' from source: unknown 12372 1727204084.76326: variable 'ansible_search_path' from source: unknown 12372 1727204084.76372: calling self._execute() 12372 1727204084.76472: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204084.76488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204084.76507: variable 'omit' from source: magic vars 12372 1727204084.77062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204084.79996: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204084.80194: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204084.80198: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204084.80200: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204084.80206: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204084.80302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204084.80351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204084.80387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204084.80454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204084.80476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204084.80644: variable 'ansible_distribution' from source: facts 12372 1727204084.80657: variable 'ansible_distribution_major_version' from source: facts 12372 1727204084.80672: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204084.80680: when evaluation is False, skipping this task 12372 1727204084.80687: _execute() done 12372 1727204084.80697: dumping result to json 12372 1727204084.80706: done dumping result, returning 12372 1727204084.80721: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-244a-02f9-00000000016f] 12372 1727204084.80733: sending task result for task 12b410aa-8751-244a-02f9-00000000016f 12372 1727204084.81020: 
done sending task result for task 12b410aa-8751-244a-02f9-00000000016f 12372 1727204084.81024: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204084.81074: no more pending results, returning what we have 12372 1727204084.81077: results queue empty 12372 1727204084.81078: checking for any_errors_fatal 12372 1727204084.81084: done checking for any_errors_fatal 12372 1727204084.81085: checking for max_fail_percentage 12372 1727204084.81087: done checking for max_fail_percentage 12372 1727204084.81088: checking to see if all hosts have failed and the running result is not ok 12372 1727204084.81091: done checking to see if all hosts have failed 12372 1727204084.81092: getting the remaining hosts for this loop 12372 1727204084.81094: done getting the remaining hosts for this loop 12372 1727204084.81098: getting the next task for host managed-node3 12372 1727204084.81106: done getting next task for host managed-node3 12372 1727204084.81110: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12372 1727204084.81114: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12372 1727204084.81137: getting variables 12372 1727204084.81139: in VariableManager get_vars() 12372 1727204084.81300: Calling all_inventory to load vars for managed-node3 12372 1727204084.81304: Calling groups_inventory to load vars for managed-node3 12372 1727204084.81307: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204084.81320: Calling all_plugins_play to load vars for managed-node3 12372 1727204084.81323: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204084.81327: Calling groups_plugins_play to load vars for managed-node3 12372 1727204084.81597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204084.82161: done with get_vars() 12372 1727204084.82172: done getting variables 12372 1727204084.82239: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.067) 0:00:11.808 ***** 12372 1727204084.82277: entering _queue_task() for managed-node3/fail 12372 1727204084.82539: worker is 1 (out of 1 available) 12372 1727204084.82552: exiting _queue_task() for managed-node3/fail 12372 1727204084.82564: done queuing things up, now waiting for results queue to drain 12372 1727204084.82566: waiting for pending results... 
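[Editor's note] Every skip in this stretch traces back to the same line, "Evaluated conditional (...): False", driven by two gathered facts: ansible_distribution and ansible_distribution_major_version. To see exactly why the guard is false on a given host, a throwaway play like the one below prints both facts and the resulting boolean; the host name managed-node3 and a matching inventory are assumed from this test environment.

- name: Show why the CentOS/RHEL < 9 guard evaluates to False
  hosts: managed-node3
  gather_facts: true
  tasks:
    - name: Print the facts and the guard result
      ansible.builtin.debug:
        msg: >-
          distribution={{ ansible_distribution }},
          major_version={{ ansible_distribution_major_version }},
          guard={{ ansible_distribution in ['CentOS','RedHat']
                   and ansible_distribution_major_version | int < 9 }}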
12372 1727204084.82864: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12372 1727204084.83195: in run() - task 12b410aa-8751-244a-02f9-000000000170 12372 1727204084.83199: variable 'ansible_search_path' from source: unknown 12372 1727204084.83202: variable 'ansible_search_path' from source: unknown 12372 1727204084.83205: calling self._execute() 12372 1727204084.83208: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204084.83226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204084.83241: variable 'omit' from source: magic vars 12372 1727204084.83766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204084.86466: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204084.86569: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204084.86625: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204084.86673: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204084.86719: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204084.86820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204084.86863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204084.86905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204084.86971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204084.86999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204084.87170: variable 'ansible_distribution' from source: facts 12372 1727204084.87244: variable 'ansible_distribution_major_version' from source: facts 12372 1727204084.87247: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204084.87249: when evaluation is False, skipping this task 12372 1727204084.87252: _execute() done 12372 1727204084.87254: dumping result to json 12372 1727204084.87256: done dumping result, returning 12372 1727204084.87259: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-244a-02f9-000000000170] 12372 1727204084.87261: sending task result for task 12b410aa-8751-244a-02f9-000000000170 skipping: [managed-node3] => { 
"changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204084.87405: no more pending results, returning what we have 12372 1727204084.87409: results queue empty 12372 1727204084.87410: checking for any_errors_fatal 12372 1727204084.87421: done checking for any_errors_fatal 12372 1727204084.87422: checking for max_fail_percentage 12372 1727204084.87424: done checking for max_fail_percentage 12372 1727204084.87425: checking to see if all hosts have failed and the running result is not ok 12372 1727204084.87426: done checking to see if all hosts have failed 12372 1727204084.87427: getting the remaining hosts for this loop 12372 1727204084.87429: done getting the remaining hosts for this loop 12372 1727204084.87433: getting the next task for host managed-node3 12372 1727204084.87441: done getting next task for host managed-node3 12372 1727204084.87446: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12372 1727204084.87451: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12372 1727204084.87472: getting variables 12372 1727204084.87474: in VariableManager get_vars() 12372 1727204084.87540: Calling all_inventory to load vars for managed-node3 12372 1727204084.87544: Calling groups_inventory to load vars for managed-node3 12372 1727204084.87547: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204084.87559: Calling all_plugins_play to load vars for managed-node3 12372 1727204084.87562: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204084.87566: Calling groups_plugins_play to load vars for managed-node3 12372 1727204084.88055: done sending task result for task 12b410aa-8751-244a-02f9-000000000170 12372 1727204084.88059: WORKER PROCESS EXITING 12372 1727204084.88086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204084.88400: done with get_vars() 12372 1727204084.88413: done getting variables 12372 1727204084.88478: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.062) 0:00:11.870 ***** 12372 1727204084.88522: entering _queue_task() for managed-node3/dnf 12372 1727204084.88771: worker is 1 (out of 1 available) 12372 1727204084.88785: exiting _queue_task() for managed-node3/dnf 12372 1727204084.88800: done queuing things up, now waiting for results queue to drain 12372 1727204084.88802: waiting for pending results... 
12372 1727204084.89097: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12372 1727204084.89287: in run() - task 12b410aa-8751-244a-02f9-000000000171 12372 1727204084.89309: variable 'ansible_search_path' from source: unknown 12372 1727204084.89324: variable 'ansible_search_path' from source: unknown 12372 1727204084.89368: calling self._execute() 12372 1727204084.89469: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204084.89542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204084.89545: variable 'omit' from source: magic vars 12372 1727204084.90029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204084.92745: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204084.92836: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204084.92884: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204084.92934: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204084.92966: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204084.93061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204084.93174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204084.93178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204084.93212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204084.93239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204084.93421: variable 'ansible_distribution' from source: facts 12372 1727204084.93435: variable 'ansible_distribution_major_version' from source: facts 12372 1727204084.93452: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204084.93461: when evaluation is False, skipping this task 12372 1727204084.93468: _execute() done 12372 1727204084.93476: dumping result to json 12372 1727204084.93484: done dumping result, returning 12372 1727204084.93501: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-244a-02f9-000000000171] 12372 1727204084.93595: sending task result for task 
12b410aa-8751-244a-02f9-000000000171 12372 1727204084.93674: done sending task result for task 12b410aa-8751-244a-02f9-000000000171 12372 1727204084.93678: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204084.93740: no more pending results, returning what we have 12372 1727204084.93745: results queue empty 12372 1727204084.93746: checking for any_errors_fatal 12372 1727204084.93752: done checking for any_errors_fatal 12372 1727204084.93753: checking for max_fail_percentage 12372 1727204084.93756: done checking for max_fail_percentage 12372 1727204084.93757: checking to see if all hosts have failed and the running result is not ok 12372 1727204084.93759: done checking to see if all hosts have failed 12372 1727204084.93760: getting the remaining hosts for this loop 12372 1727204084.93761: done getting the remaining hosts for this loop 12372 1727204084.93766: getting the next task for host managed-node3 12372 1727204084.93776: done getting next task for host managed-node3 12372 1727204084.93781: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12372 1727204084.93786: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12372 1727204084.93810: getting variables 12372 1727204084.93812: in VariableManager get_vars() 12372 1727204084.93879: Calling all_inventory to load vars for managed-node3 12372 1727204084.93883: Calling groups_inventory to load vars for managed-node3 12372 1727204084.93886: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204084.94101: Calling all_plugins_play to load vars for managed-node3 12372 1727204084.94106: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204084.94110: Calling groups_plugins_play to load vars for managed-node3 12372 1727204084.94441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204084.94741: done with get_vars() 12372 1727204084.94754: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12372 1727204084.94842: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.063) 0:00:11.934 ***** 12372 1727204084.94877: entering _queue_task() for managed-node3/yum 12372 1727204084.95149: worker is 1 (out of 1 available) 12372 1727204084.95162: exiting _queue_task() for managed-node3/yum 12372 1727204084.95175: done queuing things up, now waiting for results queue to drain 12372 1727204084.95177: waiting for pending results... 
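[Editor's note] The line "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" explains why this task is queued as "managed-node3/yum" yet loads the dnf action plugin: in this ansible-core version the yum module name is routed to the dnf implementation. The log gives only the task name and the module, so the package variable and options in the sketch below are placeholders.

- name: >-
    Check if updates for network packages are available through the
    YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:                 # resolved to the dnf action plugin, per the redirect line above
    name: "{{ network_packages }}"     # assumed variable
    state: latest
  check_mode: true                     # assumed: probe for available updates without installing them
  when: (ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)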
12372 1727204084.95609: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12372 1727204084.95676: in run() - task 12b410aa-8751-244a-02f9-000000000172 12372 1727204084.95697: variable 'ansible_search_path' from source: unknown 12372 1727204084.95710: variable 'ansible_search_path' from source: unknown 12372 1727204084.95754: calling self._execute() 12372 1727204084.95859: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204084.95874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204084.95894: variable 'omit' from source: magic vars 12372 1727204084.96415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204084.99395: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204084.99399: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204084.99401: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204084.99404: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204084.99406: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204084.99465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204084.99508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204084.99553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204084.99612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204084.99641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204084.99809: variable 'ansible_distribution' from source: facts 12372 1727204084.99825: variable 'ansible_distribution_major_version' from source: facts 12372 1727204084.99841: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204084.99855: when evaluation is False, skipping this task 12372 1727204084.99863: _execute() done 12372 1727204084.99870: dumping result to json 12372 1727204084.99879: done dumping result, returning 12372 1727204084.99893: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-244a-02f9-000000000172] 12372 1727204084.99904: sending task result for task 
12b410aa-8751-244a-02f9-000000000172 skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204085.00151: no more pending results, returning what we have 12372 1727204085.00155: results queue empty 12372 1727204085.00156: checking for any_errors_fatal 12372 1727204085.00162: done checking for any_errors_fatal 12372 1727204085.00163: checking for max_fail_percentage 12372 1727204085.00165: done checking for max_fail_percentage 12372 1727204085.00166: checking to see if all hosts have failed and the running result is not ok 12372 1727204085.00167: done checking to see if all hosts have failed 12372 1727204085.00168: getting the remaining hosts for this loop 12372 1727204085.00170: done getting the remaining hosts for this loop 12372 1727204085.00174: getting the next task for host managed-node3 12372 1727204085.00183: done getting next task for host managed-node3 12372 1727204085.00187: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12372 1727204085.00195: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12372 1727204085.00219: getting variables 12372 1727204085.00221: in VariableManager get_vars() 12372 1727204085.00282: Calling all_inventory to load vars for managed-node3 12372 1727204085.00285: Calling groups_inventory to load vars for managed-node3 12372 1727204085.00288: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204085.00504: Calling all_plugins_play to load vars for managed-node3 12372 1727204085.00508: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204085.00512: Calling groups_plugins_play to load vars for managed-node3 12372 1727204085.00745: done sending task result for task 12b410aa-8751-244a-02f9-000000000172 12372 1727204085.00749: WORKER PROCESS EXITING 12372 1727204085.00775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204085.01135: done with get_vars() 12372 1727204085.01149: done getting variables 12372 1727204085.01233: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.063) 0:00:11.998 ***** 12372 1727204085.01276: entering _queue_task() for managed-node3/fail 12372 1727204085.01644: worker is 1 (out of 1 available) 12372 1727204085.01659: exiting _queue_task() for managed-node3/fail 12372 1727204085.01672: done queuing things up, now waiting for results queue to drain 12372 1727204085.01674: waiting for pending results... 
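Note: the YUM-check task above was skipped because its conditional evaluated to False; the result's false_condition field quotes the exact expression that failed. A minimal standalone playbook (an assumption for illustration, not part of this run or of the role) that evaluates the same expression, so the skip can be reproduced against a given host:

    - hosts: managed-node3
      gather_facts: true
      tasks:
        - name: Show how the CentOS/RedHat version guard from the log evaluates here
          ansible.builtin.debug:
            msg: "{{ ansible_distribution in ['CentOS', 'RedHat'] and ansible_distribution_major_version | int < 9 }}"

Either clause coming back false is enough for the whole guard, and therefore every task carrying it, to be skipped, which is the pattern repeated for the tasks that follow.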
12372 1727204085.02032: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12372 1727204085.02258: in run() - task 12b410aa-8751-244a-02f9-000000000173 12372 1727204085.02381: variable 'ansible_search_path' from source: unknown 12372 1727204085.02385: variable 'ansible_search_path' from source: unknown 12372 1727204085.02388: calling self._execute() 12372 1727204085.02455: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204085.02470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204085.02506: variable 'omit' from source: magic vars 12372 1727204085.03087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204085.06006: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204085.06212: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204085.06217: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204085.06219: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204085.06262: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204085.06386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204085.06440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204085.06496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204085.06622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204085.06626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204085.06796: variable 'ansible_distribution' from source: facts 12372 1727204085.06810: variable 'ansible_distribution_major_version' from source: facts 12372 1727204085.06870: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204085.06878: when evaluation is False, skipping this task 12372 1727204085.06882: _execute() done 12372 1727204085.06885: dumping result to json 12372 1727204085.06887: done dumping result, returning 12372 1727204085.06904: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-244a-02f9-000000000173] 12372 1727204085.06979: sending task result for task 12b410aa-8751-244a-02f9-000000000173 12372 1727204085.07062: done sending task result for task 
12b410aa-8751-244a-02f9-000000000173 12372 1727204085.07066: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204085.07140: no more pending results, returning what we have 12372 1727204085.07144: results queue empty 12372 1727204085.07145: checking for any_errors_fatal 12372 1727204085.07153: done checking for any_errors_fatal 12372 1727204085.07154: checking for max_fail_percentage 12372 1727204085.07156: done checking for max_fail_percentage 12372 1727204085.07157: checking to see if all hosts have failed and the running result is not ok 12372 1727204085.07158: done checking to see if all hosts have failed 12372 1727204085.07159: getting the remaining hosts for this loop 12372 1727204085.07161: done getting the remaining hosts for this loop 12372 1727204085.07166: getting the next task for host managed-node3 12372 1727204085.07174: done getting next task for host managed-node3 12372 1727204085.07181: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 12372 1727204085.07186: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12372 1727204085.07210: getting variables 12372 1727204085.07213: in VariableManager get_vars() 12372 1727204085.07276: Calling all_inventory to load vars for managed-node3 12372 1727204085.07281: Calling groups_inventory to load vars for managed-node3 12372 1727204085.07284: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204085.07530: Calling all_plugins_play to load vars for managed-node3 12372 1727204085.07535: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204085.07540: Calling groups_plugins_play to load vars for managed-node3 12372 1727204085.07973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204085.08304: done with get_vars() 12372 1727204085.08316: done getting variables 12372 1727204085.08384: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.071) 0:00:12.070 ***** 12372 1727204085.08436: entering _queue_task() for managed-node3/package 12372 1727204085.08717: worker is 1 (out of 1 available) 12372 1727204085.08805: exiting _queue_task() for managed-node3/package 12372 1727204085.08816: done queuing things up, now waiting for results queue to drain 12372 1727204085.08818: waiting for pending results... 12372 1727204085.09112: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 12372 1727204085.09248: in run() - task 12b410aa-8751-244a-02f9-000000000174 12372 1727204085.09268: variable 'ansible_search_path' from source: unknown 12372 1727204085.09294: variable 'ansible_search_path' from source: unknown 12372 1727204085.09332: calling self._execute() 12372 1727204085.09436: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204085.09494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204085.09498: variable 'omit' from source: magic vars 12372 1727204085.10027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204085.12878: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204085.12975: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204085.13036: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204085.13082: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204085.13125: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204085.13247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204085.13295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204085.13327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204085.13452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204085.13457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204085.13606: variable 'ansible_distribution' from source: facts 12372 1727204085.13619: variable 'ansible_distribution_major_version' from source: facts 12372 1727204085.13636: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204085.13644: when evaluation is False, skipping this task 12372 1727204085.13652: _execute() done 12372 1727204085.13669: dumping result to json 12372 1727204085.13674: done dumping result, returning 12372 1727204085.13698: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-244a-02f9-000000000174] 12372 1727204085.13708: sending task result for task 12b410aa-8751-244a-02f9-000000000174 12372 1727204085.13860: done sending task result for task 12b410aa-8751-244a-02f9-000000000174 12372 1727204085.13864: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204085.13931: no more pending results, returning what we have 12372 1727204085.13935: results queue empty 12372 1727204085.13936: checking for any_errors_fatal 12372 1727204085.13944: done checking for any_errors_fatal 12372 1727204085.13945: checking for max_fail_percentage 12372 1727204085.13948: done checking for max_fail_percentage 12372 1727204085.13949: checking to see if all hosts have failed and the running result is not ok 12372 1727204085.13950: done checking to see if all hosts have failed 12372 1727204085.13951: getting the remaining hosts for this loop 12372 1727204085.13953: done getting the remaining hosts for this loop 12372 1727204085.13958: getting the next task for host managed-node3 12372 1727204085.13965: done getting next task for host managed-node3 12372 1727204085.13970: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12372 1727204085.13977: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12372 1727204085.14002: getting variables 12372 1727204085.14004: in VariableManager get_vars() 12372 1727204085.14064: Calling all_inventory to load vars for managed-node3 12372 1727204085.14068: Calling groups_inventory to load vars for managed-node3 12372 1727204085.14072: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204085.14083: Calling all_plugins_play to load vars for managed-node3 12372 1727204085.14087: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204085.14311: Calling groups_plugins_play to load vars for managed-node3 12372 1727204085.14637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204085.14971: done with get_vars() 12372 1727204085.14985: done getting variables 12372 1727204085.15064: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.066) 0:00:12.136 ***** 12372 1727204085.15109: entering _queue_task() for managed-node3/package 12372 1727204085.15433: worker is 1 (out of 1 available) 12372 1727204085.15446: exiting _queue_task() for managed-node3/package 12372 1727204085.15500: done queuing things up, now waiting for results queue to drain 12372 1727204085.15503: waiting for pending results... 
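Note: the "Install NetworkManager and nmstate when using network_state variable" task queued above only has work to do when the role is driven through its network_state input rather than its network_connections input. Nothing in this run sets network_state; purely as an illustration of the kind of invocation that would exercise that branch (interface name and values are placeholders, not taken from this run):

    - hosts: managed-node3
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_state:            # nmstate-format desired state (placeholder values)
              interfaces:
                - name: eth1
                  type: ethernet
                  state: up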
12372 1727204085.15909: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12372 1727204085.15933: in run() - task 12b410aa-8751-244a-02f9-000000000175 12372 1727204085.15955: variable 'ansible_search_path' from source: unknown 12372 1727204085.15963: variable 'ansible_search_path' from source: unknown 12372 1727204085.16014: calling self._execute() 12372 1727204085.16150: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204085.16154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204085.16157: variable 'omit' from source: magic vars 12372 1727204085.16783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204085.19588: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204085.19651: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204085.19691: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204085.19727: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204085.19748: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204085.19816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204085.19846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204085.19869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204085.19906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204085.19918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204085.20036: variable 'ansible_distribution' from source: facts 12372 1727204085.20041: variable 'ansible_distribution_major_version' from source: facts 12372 1727204085.20052: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204085.20056: when evaluation is False, skipping this task 12372 1727204085.20060: _execute() done 12372 1727204085.20064: dumping result to json 12372 1727204085.20069: done dumping result, returning 12372 1727204085.20078: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-244a-02f9-000000000175] 12372 1727204085.20085: sending task result for task 12b410aa-8751-244a-02f9-000000000175 12372 1727204085.20183: done sending task result for task 
12b410aa-8751-244a-02f9-000000000175 12372 1727204085.20186: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204085.20242: no more pending results, returning what we have 12372 1727204085.20246: results queue empty 12372 1727204085.20247: checking for any_errors_fatal 12372 1727204085.20254: done checking for any_errors_fatal 12372 1727204085.20255: checking for max_fail_percentage 12372 1727204085.20256: done checking for max_fail_percentage 12372 1727204085.20257: checking to see if all hosts have failed and the running result is not ok 12372 1727204085.20258: done checking to see if all hosts have failed 12372 1727204085.20259: getting the remaining hosts for this loop 12372 1727204085.20261: done getting the remaining hosts for this loop 12372 1727204085.20265: getting the next task for host managed-node3 12372 1727204085.20273: done getting next task for host managed-node3 12372 1727204085.20276: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12372 1727204085.20281: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12372 1727204085.20302: getting variables 12372 1727204085.20303: in VariableManager get_vars() 12372 1727204085.20360: Calling all_inventory to load vars for managed-node3 12372 1727204085.20364: Calling groups_inventory to load vars for managed-node3 12372 1727204085.20366: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204085.20375: Calling all_plugins_play to load vars for managed-node3 12372 1727204085.20378: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204085.20382: Calling groups_plugins_play to load vars for managed-node3 12372 1727204085.20561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204085.20728: done with get_vars() 12372 1727204085.20737: done getting variables 12372 1727204085.20785: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.057) 0:00:12.193 ***** 12372 1727204085.20816: entering _queue_task() for managed-node3/package 12372 1727204085.21058: worker is 1 (out of 1 available) 12372 1727204085.21071: exiting _queue_task() for managed-node3/package 12372 1727204085.21083: done queuing things up, now waiting for results queue to drain 12372 1727204085.21085: waiting for pending results... 
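Note: for the python3-libnmstate task queued above, the 'package' action and the task name pin down the module and package, while the only conditional visible in this output is the one later reported as false_condition; whether that guard sits on the task itself, on an enclosing block, or alongside further conditions is not visible here. A reconstruction under those assumptions (not the role source):

    - name: Install python3-libnmstate when using network_state variable
      ansible.builtin.package:        # generic action; would typically resolve to dnf on this host
        name: python3-libnmstate
        state: present
      when: ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9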
12372 1727204085.21509: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12372 1727204085.21556: in run() - task 12b410aa-8751-244a-02f9-000000000176 12372 1727204085.21572: variable 'ansible_search_path' from source: unknown 12372 1727204085.21576: variable 'ansible_search_path' from source: unknown 12372 1727204085.21633: calling self._execute() 12372 1727204085.21739: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204085.21747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204085.21760: variable 'omit' from source: magic vars 12372 1727204085.22284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204085.24094: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204085.24153: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204085.24183: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204085.24238: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204085.24261: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204085.24429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204085.24433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204085.24436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204085.24475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204085.24694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204085.24698: variable 'ansible_distribution' from source: facts 12372 1727204085.24701: variable 'ansible_distribution_major_version' from source: facts 12372 1727204085.24703: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204085.24706: when evaluation is False, skipping this task 12372 1727204085.24708: _execute() done 12372 1727204085.24711: dumping result to json 12372 1727204085.24713: done dumping result, returning 12372 1727204085.24715: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-244a-02f9-000000000176] 12372 1727204085.24718: sending task result for task 12b410aa-8751-244a-02f9-000000000176 12372 1727204085.24818: done sending task result for task 12b410aa-8751-244a-02f9-000000000176 12372 
1727204085.24821: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204085.24897: no more pending results, returning what we have 12372 1727204085.24901: results queue empty 12372 1727204085.24902: checking for any_errors_fatal 12372 1727204085.24910: done checking for any_errors_fatal 12372 1727204085.24911: checking for max_fail_percentage 12372 1727204085.24912: done checking for max_fail_percentage 12372 1727204085.24914: checking to see if all hosts have failed and the running result is not ok 12372 1727204085.24915: done checking to see if all hosts have failed 12372 1727204085.24915: getting the remaining hosts for this loop 12372 1727204085.24917: done getting the remaining hosts for this loop 12372 1727204085.24921: getting the next task for host managed-node3 12372 1727204085.24929: done getting next task for host managed-node3 12372 1727204085.24933: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12372 1727204085.24937: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12372 1727204085.24962: getting variables 12372 1727204085.24964: in VariableManager get_vars() 12372 1727204085.25016: Calling all_inventory to load vars for managed-node3 12372 1727204085.25019: Calling groups_inventory to load vars for managed-node3 12372 1727204085.25022: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204085.25031: Calling all_plugins_play to load vars for managed-node3 12372 1727204085.25036: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204085.25040: Calling groups_plugins_play to load vars for managed-node3 12372 1727204085.25304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204085.25637: done with get_vars() 12372 1727204085.25651: done getting variables 12372 1727204085.25723: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.049) 0:00:12.243 ***** 12372 1727204085.25763: entering _queue_task() for managed-node3/service 12372 1727204085.26060: worker is 1 (out of 1 available) 12372 1727204085.26077: exiting _queue_task() for managed-node3/service 12372 1727204085.26092: done queuing things up, now waiting for results queue to drain 12372 1727204085.26094: waiting for pending results... 
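Note: the "Restart NetworkManager due to wireless or team interfaces" task queued above uses the generic service action (loaded from plugins/action/service.py, which picks the concrete service backend such as systemd from the host's facts). A sketch of the likely task shape, with the same caveats as the previous reconstruction (parameters assumed from the task name, not copied from the role):

    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager
        state: restarted
      when: ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9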
12372 1727204085.26337: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12372 1727204085.26446: in run() - task 12b410aa-8751-244a-02f9-000000000177 12372 1727204085.26460: variable 'ansible_search_path' from source: unknown 12372 1727204085.26463: variable 'ansible_search_path' from source: unknown 12372 1727204085.26501: calling self._execute() 12372 1727204085.26573: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204085.26580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204085.26591: variable 'omit' from source: magic vars 12372 1727204085.27019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204085.29313: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204085.29385: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204085.29442: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204085.29466: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204085.29495: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204085.29627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204085.29761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204085.29765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204085.29815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204085.29844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204085.30041: variable 'ansible_distribution' from source: facts 12372 1727204085.30054: variable 'ansible_distribution_major_version' from source: facts 12372 1727204085.30070: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204085.30077: when evaluation is False, skipping this task 12372 1727204085.30096: _execute() done 12372 1727204085.30099: dumping result to json 12372 1727204085.30194: done dumping result, returning 12372 1727204085.30198: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-244a-02f9-000000000177] 12372 1727204085.30202: sending task result for task 12b410aa-8751-244a-02f9-000000000177 12372 1727204085.30283: done sending task result for task 12b410aa-8751-244a-02f9-000000000177 12372 
1727204085.30287: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204085.30346: no more pending results, returning what we have 12372 1727204085.30351: results queue empty 12372 1727204085.30353: checking for any_errors_fatal 12372 1727204085.30362: done checking for any_errors_fatal 12372 1727204085.30364: checking for max_fail_percentage 12372 1727204085.30366: done checking for max_fail_percentage 12372 1727204085.30367: checking to see if all hosts have failed and the running result is not ok 12372 1727204085.30368: done checking to see if all hosts have failed 12372 1727204085.30369: getting the remaining hosts for this loop 12372 1727204085.30371: done getting the remaining hosts for this loop 12372 1727204085.30376: getting the next task for host managed-node3 12372 1727204085.30385: done getting next task for host managed-node3 12372 1727204085.30392: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12372 1727204085.30397: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12372 1727204085.30423: getting variables 12372 1727204085.30426: in VariableManager get_vars() 12372 1727204085.30666: Calling all_inventory to load vars for managed-node3 12372 1727204085.30670: Calling groups_inventory to load vars for managed-node3 12372 1727204085.30674: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204085.30684: Calling all_plugins_play to load vars for managed-node3 12372 1727204085.30687: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204085.30693: Calling groups_plugins_play to load vars for managed-node3 12372 1727204085.30904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204085.31077: done with get_vars() 12372 1727204085.31086: done getting variables 12372 1727204085.31136: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.054) 0:00:12.297 ***** 12372 1727204085.31166: entering _queue_task() for managed-node3/service 12372 1727204085.31387: worker is 1 (out of 1 available) 12372 1727204085.31404: exiting _queue_task() for managed-node3/service 12372 1727204085.31416: done queuing things up, now waiting for results queue to drain 12372 1727204085.31418: waiting for pending results... 12372 1727204085.31608: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12372 1727204085.31716: in run() - task 12b410aa-8751-244a-02f9-000000000178 12372 1727204085.31731: variable 'ansible_search_path' from source: unknown 12372 1727204085.31734: variable 'ansible_search_path' from source: unknown 12372 1727204085.31769: calling self._execute() 12372 1727204085.31843: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204085.31850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204085.31864: variable 'omit' from source: magic vars 12372 1727204085.32236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204085.34410: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204085.34473: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204085.34505: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204085.34542: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204085.34564: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204085.34638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204085.34663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204085.34685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204085.34719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204085.34737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204085.34851: variable 'ansible_distribution' from source: facts 12372 1727204085.34855: variable 'ansible_distribution_major_version' from source: facts 12372 1727204085.34868: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204085.34871: when evaluation is False, skipping this task 12372 1727204085.34874: _execute() done 12372 1727204085.34880: dumping result to json 12372 1727204085.34885: done dumping result, returning 12372 1727204085.34896: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-244a-02f9-000000000178] 12372 1727204085.34901: sending task result for task 12b410aa-8751-244a-02f9-000000000178 12372 1727204085.35002: done sending task result for task 12b410aa-8751-244a-02f9-000000000178 12372 1727204085.35005: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12372 1727204085.35053: no more pending results, returning what we have 12372 1727204085.35057: results queue empty 12372 1727204085.35058: checking for any_errors_fatal 12372 1727204085.35066: done checking for any_errors_fatal 12372 1727204085.35066: checking for max_fail_percentage 12372 1727204085.35068: done checking for max_fail_percentage 12372 1727204085.35069: checking to see if all hosts have failed and the running result is not ok 12372 1727204085.35070: done checking to see if all hosts have failed 12372 1727204085.35071: getting the remaining hosts for this loop 12372 1727204085.35072: done getting the remaining hosts for this loop 12372 1727204085.35077: getting the next task for host managed-node3 12372 1727204085.35085: done getting next task for host managed-node3 12372 1727204085.35095: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12372 1727204085.35099: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12372 1727204085.35124: getting variables 12372 1727204085.35127: in VariableManager get_vars() 12372 1727204085.35179: Calling all_inventory to load vars for managed-node3 12372 1727204085.35183: Calling groups_inventory to load vars for managed-node3 12372 1727204085.35185: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204085.35197: Calling all_plugins_play to load vars for managed-node3 12372 1727204085.35199: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204085.35208: Calling groups_plugins_play to load vars for managed-node3 12372 1727204085.35366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204085.35564: done with get_vars() 12372 1727204085.35574: done getting variables 12372 1727204085.35624: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.044) 0:00:12.342 ***** 12372 1727204085.35654: entering _queue_task() for managed-node3/service 12372 1727204085.35868: worker is 1 (out of 1 available) 12372 1727204085.35883: exiting _queue_task() for managed-node3/service 12372 1727204085.35899: done queuing things up, now waiting for results queue to drain 12372 1727204085.35901: waiting for pending results... 
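Note: the "censored" result a little above (for "Enable and start NetworkManager") is standard Ansible behaviour when a task sets no_log: true; the result payload is hidden even though the task was merely skipped. A minimal illustration of the mechanism, with module parameters assumed from the task name rather than taken from the role:

    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: NetworkManager
        state: started
        enabled: true
      no_log: true          # hides the result payload in output and callbacks, even on skip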
12372 1727204085.36091: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12372 1727204085.36243: in run() - task 12b410aa-8751-244a-02f9-000000000179 12372 1727204085.36249: variable 'ansible_search_path' from source: unknown 12372 1727204085.36252: variable 'ansible_search_path' from source: unknown 12372 1727204085.36274: calling self._execute() 12372 1727204085.36350: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204085.36355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204085.36368: variable 'omit' from source: magic vars 12372 1727204085.36736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204085.39450: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204085.39533: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204085.39602: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204085.39657: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204085.39712: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204085.39834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204085.39897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204085.39996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204085.40012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204085.40045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204085.40255: variable 'ansible_distribution' from source: facts 12372 1727204085.40294: variable 'ansible_distribution_major_version' from source: facts 12372 1727204085.40298: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204085.40300: when evaluation is False, skipping this task 12372 1727204085.40302: _execute() done 12372 1727204085.40305: dumping result to json 12372 1727204085.40314: done dumping result, returning 12372 1727204085.40344: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-244a-02f9-000000000179] 12372 1727204085.40396: sending task result for task 12b410aa-8751-244a-02f9-000000000179 12372 1727204085.40652: done sending task result for task 12b410aa-8751-244a-02f9-000000000179 12372 1727204085.40658: WORKER PROCESS EXITING skipping: 
[managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204085.40725: no more pending results, returning what we have 12372 1727204085.40730: results queue empty 12372 1727204085.40731: checking for any_errors_fatal 12372 1727204085.40740: done checking for any_errors_fatal 12372 1727204085.40741: checking for max_fail_percentage 12372 1727204085.40744: done checking for max_fail_percentage 12372 1727204085.40745: checking to see if all hosts have failed and the running result is not ok 12372 1727204085.40746: done checking to see if all hosts have failed 12372 1727204085.40747: getting the remaining hosts for this loop 12372 1727204085.40750: done getting the remaining hosts for this loop 12372 1727204085.40755: getting the next task for host managed-node3 12372 1727204085.40764: done getting next task for host managed-node3 12372 1727204085.40772: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12372 1727204085.40777: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12372 1727204085.40803: getting variables 12372 1727204085.40806: in VariableManager get_vars() 12372 1727204085.40875: Calling all_inventory to load vars for managed-node3 12372 1727204085.40881: Calling groups_inventory to load vars for managed-node3 12372 1727204085.40885: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204085.41100: Calling all_plugins_play to load vars for managed-node3 12372 1727204085.41105: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204085.41110: Calling groups_plugins_play to load vars for managed-node3 12372 1727204085.41405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204085.41749: done with get_vars() 12372 1727204085.41764: done getting variables 12372 1727204085.41822: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.061) 0:00:12.404 ***** 12372 1727204085.41849: entering _queue_task() for managed-node3/service 12372 1727204085.42066: worker is 1 (out of 1 available) 12372 1727204085.42081: exiting _queue_task() for managed-node3/service 12372 1727204085.42095: done queuing things up, now waiting for results queue to drain 12372 1727204085.42098: waiting for pending results... 12372 1727204085.42280: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 12372 1727204085.42384: in run() - task 12b410aa-8751-244a-02f9-00000000017a 12372 1727204085.42400: variable 'ansible_search_path' from source: unknown 12372 1727204085.42403: variable 'ansible_search_path' from source: unknown 12372 1727204085.42442: calling self._execute() 12372 1727204085.42519: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204085.42523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204085.42532: variable 'omit' from source: magic vars 12372 1727204085.42900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204085.44695: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204085.44752: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204085.44782: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204085.44818: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204085.44847: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204085.44912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204085.44939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204085.44965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204085.44999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204085.45012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204085.45125: variable 'ansible_distribution' from source: facts 12372 1727204085.45131: variable 'ansible_distribution_major_version' from source: facts 12372 1727204085.45142: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204085.45145: when evaluation is False, skipping this task 12372 1727204085.45150: _execute() done 12372 1727204085.45153: dumping result to json 12372 1727204085.45160: done dumping result, returning 12372 1727204085.45170: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-244a-02f9-00000000017a] 12372 1727204085.45173: sending task result for task 12b410aa-8751-244a-02f9-00000000017a 12372 1727204085.45275: done sending task result for task 12b410aa-8751-244a-02f9-00000000017a 12372 1727204085.45278: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12372 1727204085.45325: no more pending results, returning what we have 12372 1727204085.45329: results queue empty 12372 1727204085.45330: checking for any_errors_fatal 12372 1727204085.45338: done checking for any_errors_fatal 12372 1727204085.45339: checking for max_fail_percentage 12372 1727204085.45341: done checking for max_fail_percentage 12372 1727204085.45342: checking to see if all hosts have failed and the running result is not ok 12372 1727204085.45343: done checking to see if all hosts have failed 12372 1727204085.45344: getting the remaining hosts for this loop 12372 1727204085.45345: done getting the remaining hosts for this loop 12372 1727204085.45350: getting the next task for host managed-node3 12372 1727204085.45358: done getting next task for host managed-node3 12372 1727204085.45362: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12372 1727204085.45367: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12372 1727204085.45386: getting variables 12372 1727204085.45388: in VariableManager get_vars() 12372 1727204085.45451: Calling all_inventory to load vars for managed-node3 12372 1727204085.45454: Calling groups_inventory to load vars for managed-node3 12372 1727204085.45457: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204085.45466: Calling all_plugins_play to load vars for managed-node3 12372 1727204085.45469: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204085.45472: Calling groups_plugins_play to load vars for managed-node3 12372 1727204085.45667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204085.45839: done with get_vars() 12372 1727204085.45848: done getting variables 12372 1727204085.45896: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.040) 0:00:12.444 ***** 12372 1727204085.45924: entering _queue_task() for managed-node3/copy 12372 1727204085.46131: worker is 1 (out of 1 available) 12372 1727204085.46148: exiting _queue_task() for managed-node3/copy 12372 1727204085.46159: done queuing things up, now waiting for results queue to drain 12372 1727204085.46161: waiting for pending results... 
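The 'Enable network service' result above comes back censored because 'no_log: true' is set on that task; the return data is replaced with the placeholder text even though the task was only skipped. A minimal standalone sketch of that pattern, with a placeholder service name since the real module arguments are not visible in this log:

    ---
    # Standalone sketch of the no_log pattern seen above; the service name and
    # module arguments are placeholders, not taken from the role.
    - hosts: managed-node3
      gather_facts: true
      tasks:
        - name: Enable network service (illustrative)
          ansible.builtin.service:
            name: network        # placeholder for whatever the real task manages
            enabled: true
          no_log: true           # return data is replaced with the "censored" text,
                                 # even when the task ends up skipped
          when:
            - ansible_distribution in ['CentOS', 'RedHat']
            - ansible_distribution_major_version | int < 9

With no_log in place, even verbose output like this run shows only the censored placeholder for that task's result.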
12372 1727204085.46354: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12372 1727204085.46466: in run() - task 12b410aa-8751-244a-02f9-00000000017b 12372 1727204085.46477: variable 'ansible_search_path' from source: unknown 12372 1727204085.46481: variable 'ansible_search_path' from source: unknown 12372 1727204085.46519: calling self._execute() 12372 1727204085.46593: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204085.46598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204085.46612: variable 'omit' from source: magic vars 12372 1727204085.46978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204085.48750: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204085.48813: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204085.48849: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204085.48878: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204085.48905: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204085.48975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204085.49002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204085.49027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204085.49064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204085.49077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204085.49192: variable 'ansible_distribution' from source: facts 12372 1727204085.49199: variable 'ansible_distribution_major_version' from source: facts 12372 1727204085.49210: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204085.49213: when evaluation is False, skipping this task 12372 1727204085.49216: _execute() done 12372 1727204085.49225: dumping result to json 12372 1727204085.49227: done dumping result, returning 12372 1727204085.49235: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-244a-02f9-00000000017b] 12372 1727204085.49241: sending task result for task 12b410aa-8751-244a-02f9-00000000017b 12372 1727204085.49338: done sending task result for task 12b410aa-8751-244a-02f9-00000000017b 12372 1727204085.49341: 
WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204085.49393: no more pending results, returning what we have 12372 1727204085.49397: results queue empty 12372 1727204085.49398: checking for any_errors_fatal 12372 1727204085.49406: done checking for any_errors_fatal 12372 1727204085.49407: checking for max_fail_percentage 12372 1727204085.49409: done checking for max_fail_percentage 12372 1727204085.49410: checking to see if all hosts have failed and the running result is not ok 12372 1727204085.49411: done checking to see if all hosts have failed 12372 1727204085.49412: getting the remaining hosts for this loop 12372 1727204085.49413: done getting the remaining hosts for this loop 12372 1727204085.49418: getting the next task for host managed-node3 12372 1727204085.49425: done getting next task for host managed-node3 12372 1727204085.49429: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12372 1727204085.49433: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12372 1727204085.49451: getting variables 12372 1727204085.49453: in VariableManager get_vars() 12372 1727204085.49508: Calling all_inventory to load vars for managed-node3 12372 1727204085.49512: Calling groups_inventory to load vars for managed-node3 12372 1727204085.49515: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204085.49524: Calling all_plugins_play to load vars for managed-node3 12372 1727204085.49527: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204085.49531: Calling groups_plugins_play to load vars for managed-node3 12372 1727204085.49677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204085.49862: done with get_vars() 12372 1727204085.49871: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.040) 0:00:12.485 ***** 12372 1727204085.49944: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 12372 1727204085.50147: worker is 1 (out of 1 available) 12372 1727204085.50163: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 12372 1727204085.50176: done queuing things up, now waiting for results queue to drain 12372 1727204085.50178: waiting for pending results... 
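Each of these skips reports the same 'false_condition'. To see why on a given host, the same expression can be evaluated directly against the gathered facts; this check is an illustration and not part of the role:

    ---
    # Evaluates the role's guard expression against the gathered facts; on the
    # hosts in this run it prints False, matching the skips recorded above.
    - hosts: managed-node3
      gather_facts: true
      tasks:
        - name: Show why the initscripts-only tasks are skipped
          ansible.builtin.debug:
            msg: >-
              {{ ansible_distribution }} {{ ansible_distribution_major_version }}
              -> initscripts branch taken:
              {{ ansible_distribution in ['CentOS', 'RedHat']
              and ansible_distribution_major_version | int < 9 }}

On the hosts in this run the expression evaluates to False, which is exactly what the 'Evaluated conditional ...: False' lines record.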
12372 1727204085.50506: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12372 1727204085.50537: in run() - task 12b410aa-8751-244a-02f9-00000000017c 12372 1727204085.50558: variable 'ansible_search_path' from source: unknown 12372 1727204085.50566: variable 'ansible_search_path' from source: unknown 12372 1727204085.50612: calling self._execute() 12372 1727204085.50720: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204085.50737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204085.50755: variable 'omit' from source: magic vars 12372 1727204085.51280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204085.53178: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204085.53238: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204085.53269: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204085.53301: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204085.53329: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204085.53395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204085.53421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204085.53449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204085.53481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204085.53496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204085.53608: variable 'ansible_distribution' from source: facts 12372 1727204085.53613: variable 'ansible_distribution_major_version' from source: facts 12372 1727204085.53626: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204085.53629: when evaluation is False, skipping this task 12372 1727204085.53632: _execute() done 12372 1727204085.53637: dumping result to json 12372 1727204085.53646: done dumping result, returning 12372 1727204085.53653: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-244a-02f9-00000000017c] 12372 1727204085.53659: sending task result for task 12b410aa-8751-244a-02f9-00000000017c 12372 1727204085.53965: done sending task result for task 12b410aa-8751-244a-02f9-00000000017c 12372 1727204085.53969: WORKER PROCESS EXITING 
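The 'Configure networking connection profiles' task above hands control to the collection's network_connections action. The variable shape below follows the role's documented network_connections format to the best of my knowledge and is purely illustrative; only the 'nm-bond' name is taken from this run (it is the value controller_device resolves to later in the log).

    ---
    # Illustrative invocation only; the connection fields follow the role's
    # documented schema as far as I know, and every value here is an assumption.
    - hosts: managed-node3
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_connections:
              - name: nm-bond           # reuses the controller_device name from this test
                type: bond
                interface_name: nm-bond
                state: up
                bond:
                  mode: active-backup
                ip:
                  dhcp4: true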
skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204085.54136: no more pending results, returning what we have 12372 1727204085.54139: results queue empty 12372 1727204085.54140: checking for any_errors_fatal 12372 1727204085.54147: done checking for any_errors_fatal 12372 1727204085.54148: checking for max_fail_percentage 12372 1727204085.54150: done checking for max_fail_percentage 12372 1727204085.54151: checking to see if all hosts have failed and the running result is not ok 12372 1727204085.54152: done checking to see if all hosts have failed 12372 1727204085.54153: getting the remaining hosts for this loop 12372 1727204085.54154: done getting the remaining hosts for this loop 12372 1727204085.54157: getting the next task for host managed-node3 12372 1727204085.54164: done getting next task for host managed-node3 12372 1727204085.54167: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12372 1727204085.54172: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12372 1727204085.54191: getting variables 12372 1727204085.54192: in VariableManager get_vars() 12372 1727204085.54241: Calling all_inventory to load vars for managed-node3 12372 1727204085.54244: Calling groups_inventory to load vars for managed-node3 12372 1727204085.54246: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204085.54255: Calling all_plugins_play to load vars for managed-node3 12372 1727204085.54259: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204085.54262: Calling groups_plugins_play to load vars for managed-node3 12372 1727204085.54669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204085.54975: done with get_vars() 12372 1727204085.54987: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.051) 0:00:12.536 ***** 12372 1727204085.55088: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 12372 1727204085.55363: worker is 1 (out of 1 available) 12372 1727204085.55378: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 12372 1727204085.55593: done queuing things up, now waiting for results queue to drain 12372 1727204085.55596: waiting for pending results... 
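The next task, 'Configure networking state', passes the role's network_state variable to the network_state action. The nmstate-style layout below is an assumption based on the role's documentation, with invented values; nothing here is taken from this run.

    ---
    # Sketch only: network_state is assumed to take nmstate-style declarative
    # input, per the role's documentation; the values below are invented.
    - hosts: managed-node3
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_state:
              interfaces:
                - name: eth1            # hypothetical interface
                  type: ethernet
                  state: up
                  ipv4:
                    enabled: true
                    dhcp: true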
12372 1727204085.55730: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 12372 1727204085.55910: in run() - task 12b410aa-8751-244a-02f9-00000000017d 12372 1727204085.55941: variable 'ansible_search_path' from source: unknown 12372 1727204085.55952: variable 'ansible_search_path' from source: unknown 12372 1727204085.56003: calling self._execute() 12372 1727204085.56123: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204085.56139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204085.56161: variable 'omit' from source: magic vars 12372 1727204085.56731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204085.59484: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204085.59577: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204085.59626: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204085.59675: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204085.59712: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204085.59995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204085.59999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204085.60003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204085.60006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204085.60009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204085.60170: variable 'ansible_distribution' from source: facts 12372 1727204085.60183: variable 'ansible_distribution_major_version' from source: facts 12372 1727204085.60204: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204085.60213: when evaluation is False, skipping this task 12372 1727204085.60225: _execute() done 12372 1727204085.60238: dumping result to json 12372 1727204085.60248: done dumping result, returning 12372 1727204085.60263: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-244a-02f9-00000000017d] 12372 1727204085.60275: sending task result for task 12b410aa-8751-244a-02f9-00000000017d skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", 
"skip_reason": "Conditional result was False" } 12372 1727204085.60564: no more pending results, returning what we have 12372 1727204085.60568: results queue empty 12372 1727204085.60569: checking for any_errors_fatal 12372 1727204085.60578: done checking for any_errors_fatal 12372 1727204085.60580: checking for max_fail_percentage 12372 1727204085.60582: done checking for max_fail_percentage 12372 1727204085.60583: checking to see if all hosts have failed and the running result is not ok 12372 1727204085.60584: done checking to see if all hosts have failed 12372 1727204085.60586: getting the remaining hosts for this loop 12372 1727204085.60587: done getting the remaining hosts for this loop 12372 1727204085.60594: getting the next task for host managed-node3 12372 1727204085.60604: done getting next task for host managed-node3 12372 1727204085.60609: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12372 1727204085.60614: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12372 1727204085.60641: getting variables 12372 1727204085.60644: in VariableManager get_vars() 12372 1727204085.60911: Calling all_inventory to load vars for managed-node3 12372 1727204085.60915: Calling groups_inventory to load vars for managed-node3 12372 1727204085.60921: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204085.60932: Calling all_plugins_play to load vars for managed-node3 12372 1727204085.60935: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204085.60939: Calling groups_plugins_play to load vars for managed-node3 12372 1727204085.61181: done sending task result for task 12b410aa-8751-244a-02f9-00000000017d 12372 1727204085.61184: WORKER PROCESS EXITING 12372 1727204085.61215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204085.61522: done with get_vars() 12372 1727204085.61535: done getting variables 12372 1727204085.61608: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.065) 0:00:12.602 ***** 12372 1727204085.61653: entering _queue_task() for managed-node3/debug 12372 1727204085.61944: worker is 1 (out of 1 available) 12372 1727204085.61959: exiting _queue_task() for managed-node3/debug 12372 1727204085.61973: done queuing things up, now waiting for results queue to drain 12372 1727204085.61975: waiting for pending results... 
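The role's 'Show stderr messages' and 'Show debug messages' tasks queued here are ordinary debug tasks over a previously registered result. A sketch of that register-then-debug pattern; '__result' and the 'ip -br link' command are hypothetical stand-ins for the example, not the role's internals:

    ---
    # Register-then-debug pattern; '__result' and the 'ip -br link' command are
    # hypothetical stand-ins, not the role's internal variable or commands.
    - hosts: managed-node3
      gather_facts: false
      tasks:
        - name: Run something whose output we want to inspect
          ansible.builtin.command: ip -br link
          register: __result
          changed_when: false

        - name: Show stderr messages
          ansible.builtin.debug:
            var: __result.stderr_lines

        - name: Show debug messages
          ansible.builtin.debug:
            var: __result
            verbosity: 1          # printed only when run with -v or higher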
12372 1727204085.62298: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12372 1727204085.62474: in run() - task 12b410aa-8751-244a-02f9-00000000017e 12372 1727204085.62501: variable 'ansible_search_path' from source: unknown 12372 1727204085.62509: variable 'ansible_search_path' from source: unknown 12372 1727204085.62561: calling self._execute() 12372 1727204085.62670: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204085.62684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204085.62704: variable 'omit' from source: magic vars 12372 1727204085.63305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204085.65316: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204085.65319: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204085.65322: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204085.65347: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204085.65385: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204085.65495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204085.65537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204085.65581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204085.65640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204085.65668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204085.65839: variable 'ansible_distribution' from source: facts 12372 1727204085.65852: variable 'ansible_distribution_major_version' from source: facts 12372 1727204085.65887: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204085.65900: when evaluation is False, skipping this task 12372 1727204085.65917: _execute() done 12372 1727204085.65927: dumping result to json 12372 1727204085.65937: done dumping result, returning 12372 1727204085.65973: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-244a-02f9-00000000017e] 12372 1727204085.65988: sending task result for task 12b410aa-8751-244a-02f9-00000000017e skipping: [managed-node3] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)" } 12372 1727204085.66174: no more pending results, returning what we have 12372 1727204085.66178: results queue empty 12372 1727204085.66179: checking for any_errors_fatal 12372 1727204085.66186: done checking for any_errors_fatal 12372 1727204085.66186: checking for max_fail_percentage 12372 1727204085.66193: done checking for max_fail_percentage 12372 1727204085.66194: checking to see if all hosts have failed and the running result is not ok 12372 1727204085.66195: done checking to see if all hosts have failed 12372 1727204085.66197: getting the remaining hosts for this loop 12372 1727204085.66198: done getting the remaining hosts for this loop 12372 1727204085.66203: getting the next task for host managed-node3 12372 1727204085.66211: done getting next task for host managed-node3 12372 1727204085.66216: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12372 1727204085.66220: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12372 1727204085.66241: getting variables 12372 1727204085.66243: in VariableManager get_vars() 12372 1727204085.66321: Calling all_inventory to load vars for managed-node3 12372 1727204085.66325: Calling groups_inventory to load vars for managed-node3 12372 1727204085.66328: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204085.66335: done sending task result for task 12b410aa-8751-244a-02f9-00000000017e 12372 1727204085.66338: WORKER PROCESS EXITING 12372 1727204085.66348: Calling all_plugins_play to load vars for managed-node3 12372 1727204085.66352: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204085.66356: Calling groups_plugins_play to load vars for managed-node3 12372 1727204085.66579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204085.66756: done with get_vars() 12372 1727204085.66765: done getting variables 12372 1727204085.66817: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.051) 0:00:12.654 ***** 12372 1727204085.66846: entering _queue_task() for managed-node3/debug 12372 1727204085.67065: worker is 1 (out of 1 available) 12372 1727204085.67081: exiting _queue_task() for managed-node3/debug 12372 1727204085.67095: done queuing things up, now waiting for results queue to drain 12372 1727204085.67097: waiting for pending results... 
12372 1727204085.67285: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12372 1727204085.67393: in run() - task 12b410aa-8751-244a-02f9-00000000017f 12372 1727204085.67406: variable 'ansible_search_path' from source: unknown 12372 1727204085.67409: variable 'ansible_search_path' from source: unknown 12372 1727204085.67447: calling self._execute() 12372 1727204085.67518: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204085.67528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204085.67539: variable 'omit' from source: magic vars 12372 1727204085.67905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204085.70275: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204085.70334: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204085.70370: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204085.70402: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204085.70428: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204085.70502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204085.70532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204085.70554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204085.70592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204085.70606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204085.70727: variable 'ansible_distribution' from source: facts 12372 1727204085.70733: variable 'ansible_distribution_major_version' from source: facts 12372 1727204085.70745: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204085.70748: when evaluation is False, skipping this task 12372 1727204085.70752: _execute() done 12372 1727204085.70757: dumping result to json 12372 1727204085.70762: done dumping result, returning 12372 1727204085.70772: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-244a-02f9-00000000017f] 12372 1727204085.70783: sending task result for task 12b410aa-8751-244a-02f9-00000000017f 12372 1727204085.70877: done sending task result for task 12b410aa-8751-244a-02f9-00000000017f 12372 1727204085.70881: WORKER 
PROCESS EXITING skipping: [managed-node3] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12372 1727204085.70944: no more pending results, returning what we have 12372 1727204085.70949: results queue empty 12372 1727204085.70951: checking for any_errors_fatal 12372 1727204085.70957: done checking for any_errors_fatal 12372 1727204085.70957: checking for max_fail_percentage 12372 1727204085.70959: done checking for max_fail_percentage 12372 1727204085.70961: checking to see if all hosts have failed and the running result is not ok 12372 1727204085.70962: done checking to see if all hosts have failed 12372 1727204085.70962: getting the remaining hosts for this loop 12372 1727204085.70964: done getting the remaining hosts for this loop 12372 1727204085.70968: getting the next task for host managed-node3 12372 1727204085.70976: done getting next task for host managed-node3 12372 1727204085.70980: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12372 1727204085.70985: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12372 1727204085.71012: getting variables 12372 1727204085.71014: in VariableManager get_vars() 12372 1727204085.71066: Calling all_inventory to load vars for managed-node3 12372 1727204085.71070: Calling groups_inventory to load vars for managed-node3 12372 1727204085.71072: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204085.71082: Calling all_plugins_play to load vars for managed-node3 12372 1727204085.71085: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204085.71088: Calling groups_plugins_play to load vars for managed-node3 12372 1727204085.71252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204085.71424: done with get_vars() 12372 1727204085.71434: done getting variables 12372 1727204085.71485: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.046) 0:00:12.700 ***** 12372 1727204085.71515: entering _queue_task() for managed-node3/debug 12372 1727204085.71818: worker is 1 (out of 1 available) 12372 1727204085.71834: exiting _queue_task() for managed-node3/debug 12372 1727204085.71848: done queuing things up, now waiting for results queue to drain 12372 1727204085.71850: waiting for pending results... 12372 1727204085.72209: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12372 1727204085.72346: in run() - task 12b410aa-8751-244a-02f9-000000000180 12372 1727204085.72495: variable 'ansible_search_path' from source: unknown 12372 1727204085.72499: variable 'ansible_search_path' from source: unknown 12372 1727204085.72501: calling self._execute() 12372 1727204085.72533: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204085.72551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204085.72571: variable 'omit' from source: magic vars 12372 1727204085.73157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204085.75228: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204085.75283: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204085.75532: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204085.75563: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204085.75587: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204085.75661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204085.75687: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204085.75712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204085.75749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204085.75762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204085.75879: variable 'ansible_distribution' from source: facts 12372 1727204085.75886: variable 'ansible_distribution_major_version' from source: facts 12372 1727204085.75898: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204085.75903: when evaluation is False, skipping this task 12372 1727204085.75906: _execute() done 12372 1727204085.75908: dumping result to json 12372 1727204085.75914: done dumping result, returning 12372 1727204085.75925: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-244a-02f9-000000000180] 12372 1727204085.75930: sending task result for task 12b410aa-8751-244a-02f9-000000000180 12372 1727204085.76030: done sending task result for task 12b410aa-8751-244a-02f9-000000000180 12372 1727204085.76033: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12372 1727204085.76086: no more pending results, returning what we have 12372 1727204085.76091: results queue empty 12372 1727204085.76093: checking for any_errors_fatal 12372 1727204085.76101: done checking for any_errors_fatal 12372 1727204085.76101: checking for max_fail_percentage 12372 1727204085.76103: done checking for max_fail_percentage 12372 1727204085.76104: checking to see if all hosts have failed and the running result is not ok 12372 1727204085.76105: done checking to see if all hosts have failed 12372 1727204085.76106: getting the remaining hosts for this loop 12372 1727204085.76108: done getting the remaining hosts for this loop 12372 1727204085.76113: getting the next task for host managed-node3 12372 1727204085.76120: done getting next task for host managed-node3 12372 1727204085.76125: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12372 1727204085.76129: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12372 1727204085.76149: getting variables 12372 1727204085.76151: in VariableManager get_vars() 12372 1727204085.76211: Calling all_inventory to load vars for managed-node3 12372 1727204085.76214: Calling groups_inventory to load vars for managed-node3 12372 1727204085.76220: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204085.76235: Calling all_plugins_play to load vars for managed-node3 12372 1727204085.76240: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204085.76244: Calling groups_plugins_play to load vars for managed-node3 12372 1727204085.76541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204085.76855: done with get_vars() 12372 1727204085.76868: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.054) 0:00:12.755 ***** 12372 1727204085.76994: entering _queue_task() for managed-node3/ping 12372 1727204085.77402: worker is 1 (out of 1 available) 12372 1727204085.77461: exiting _queue_task() for managed-node3/ping 12372 1727204085.77475: done queuing things up, now waiting for results queue to drain 12372 1727204085.77477: waiting for pending results... 12372 1727204085.77604: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 12372 1727204085.77712: in run() - task 12b410aa-8751-244a-02f9-000000000181 12372 1727204085.77728: variable 'ansible_search_path' from source: unknown 12372 1727204085.77734: variable 'ansible_search_path' from source: unknown 12372 1727204085.77765: calling self._execute() 12372 1727204085.77841: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204085.77850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204085.77859: variable 'omit' from source: magic vars 12372 1727204085.78275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204085.80183: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204085.80238: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204085.80269: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204085.80300: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204085.80324: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204085.80400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204085.80425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204085.80449: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204085.80484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204085.80499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204085.80612: variable 'ansible_distribution' from source: facts 12372 1727204085.80621: variable 'ansible_distribution_major_version' from source: facts 12372 1727204085.80630: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204085.80633: when evaluation is False, skipping this task 12372 1727204085.80635: _execute() done 12372 1727204085.80640: dumping result to json 12372 1727204085.80645: done dumping result, returning 12372 1727204085.80653: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-244a-02f9-000000000181] 12372 1727204085.80658: sending task result for task 12b410aa-8751-244a-02f9-000000000181 12372 1727204085.80750: done sending task result for task 12b410aa-8751-244a-02f9-000000000181 12372 1727204085.80753: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204085.80834: no more pending results, returning what we have 12372 1727204085.80838: results queue empty 12372 1727204085.80839: checking for any_errors_fatal 12372 1727204085.80846: done checking for any_errors_fatal 12372 1727204085.80847: checking for max_fail_percentage 12372 1727204085.80848: done checking for max_fail_percentage 12372 1727204085.80850: checking to see if all hosts have failed and the running result is not ok 12372 1727204085.80851: done checking to see if all hosts have failed 12372 1727204085.80851: getting the remaining hosts for this loop 12372 1727204085.80853: done getting the remaining hosts for this loop 12372 1727204085.80857: getting the next task for host managed-node3 12372 1727204085.80867: done getting next task for host managed-node3 12372 1727204085.80870: ^ task is: TASK: meta (role_complete) 12372 1727204085.80874: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12372 1727204085.80895: getting variables 12372 1727204085.80898: in VariableManager get_vars() 12372 1727204085.80950: Calling all_inventory to load vars for managed-node3 12372 1727204085.80953: Calling groups_inventory to load vars for managed-node3 12372 1727204085.80956: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204085.80965: Calling all_plugins_play to load vars for managed-node3 12372 1727204085.80968: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204085.80971: Calling groups_plugins_play to load vars for managed-node3 12372 1727204085.81163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204085.81358: done with get_vars() 12372 1727204085.81366: done getting variables 12372 1727204085.81434: done queuing things up, now waiting for results queue to drain 12372 1727204085.81436: results queue empty 12372 1727204085.81437: checking for any_errors_fatal 12372 1727204085.81438: done checking for any_errors_fatal 12372 1727204085.81439: checking for max_fail_percentage 12372 1727204085.81440: done checking for max_fail_percentage 12372 1727204085.81440: checking to see if all hosts have failed and the running result is not ok 12372 1727204085.81441: done checking to see if all hosts have failed 12372 1727204085.81442: getting the remaining hosts for this loop 12372 1727204085.81442: done getting the remaining hosts for this loop 12372 1727204085.81444: getting the next task for host managed-node3 12372 1727204085.81447: done getting next task for host managed-node3 12372 1727204085.81449: ^ task is: TASK: Delete the device '{{ controller_device }}' 12372 1727204085.81450: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12372 1727204085.81453: getting variables 12372 1727204085.81453: in VariableManager get_vars() 12372 1727204085.81473: Calling all_inventory to load vars for managed-node3 12372 1727204085.81475: Calling groups_inventory to load vars for managed-node3 12372 1727204085.81476: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204085.81481: Calling all_plugins_play to load vars for managed-node3 12372 1727204085.81484: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204085.81486: Calling groups_plugins_play to load vars for managed-node3 12372 1727204085.81601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204085.81754: done with get_vars() 12372 1727204085.81762: done getting variables 12372 1727204085.81795: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12372 1727204085.81900: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:242 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.049) 0:00:12.804 ***** 12372 1727204085.81927: entering _queue_task() for managed-node3/command 12372 1727204085.82141: worker is 1 (out of 1 available) 12372 1727204085.82157: exiting _queue_task() for managed-node3/command 12372 1727204085.82172: done queuing things up, now waiting for results queue to drain 12372 1727204085.82174: waiting for pending results... 
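The test playbook's cleanup task templates controller_device (rendered as 'nm-bond' here) into both its name and its command. The actual command at tests_bond_removal.yml:242 is not visible in this log, so the 'ip link delete' call below is only a plausible sketch:

    ---
    # Assumed shape of the cleanup task; the real command at
    # tests_bond_removal.yml:242 is not shown in this log.
    - hosts: managed-node3
      gather_facts: false
      vars:
        controller_device: nm-bond
      tasks:
        - name: Delete the device '{{ controller_device }}'
          ansible.builtin.command: ip link delete {{ controller_device }}
          failed_when: false      # tolerate the device already being gone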
12372 1727204085.82356: running TaskExecutor() for managed-node3/TASK: Delete the device 'nm-bond' 12372 1727204085.82438: in run() - task 12b410aa-8751-244a-02f9-0000000001b1 12372 1727204085.82450: variable 'ansible_search_path' from source: unknown 12372 1727204085.82481: calling self._execute() 12372 1727204085.82556: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204085.82562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204085.82572: variable 'omit' from source: magic vars 12372 1727204085.82940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204085.84933: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204085.84982: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204085.85016: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204085.85051: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204085.85074: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204085.85153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204085.85178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204085.85202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204085.85236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204085.85252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204085.85363: variable 'ansible_distribution' from source: facts 12372 1727204085.85368: variable 'ansible_distribution_major_version' from source: facts 12372 1727204085.85380: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204085.85384: when evaluation is False, skipping this task 12372 1727204085.85387: _execute() done 12372 1727204085.85391: dumping result to json 12372 1727204085.85396: done dumping result, returning 12372 1727204085.85404: done running TaskExecutor() for managed-node3/TASK: Delete the device 'nm-bond' [12b410aa-8751-244a-02f9-0000000001b1] 12372 1727204085.85409: sending task result for task 12b410aa-8751-244a-02f9-0000000001b1 12372 1727204085.85498: done sending task result for task 12b410aa-8751-244a-02f9-0000000001b1 12372 1727204085.85501: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", 
"skip_reason": "Conditional result was False" } 12372 1727204085.85553: no more pending results, returning what we have 12372 1727204085.85557: results queue empty 12372 1727204085.85558: checking for any_errors_fatal 12372 1727204085.85560: done checking for any_errors_fatal 12372 1727204085.85561: checking for max_fail_percentage 12372 1727204085.85562: done checking for max_fail_percentage 12372 1727204085.85563: checking to see if all hosts have failed and the running result is not ok 12372 1727204085.85564: done checking to see if all hosts have failed 12372 1727204085.85565: getting the remaining hosts for this loop 12372 1727204085.85567: done getting the remaining hosts for this loop 12372 1727204085.85571: getting the next task for host managed-node3 12372 1727204085.85580: done getting next task for host managed-node3 12372 1727204085.85583: ^ task is: TASK: Remove test interfaces 12372 1727204085.85587: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False
12372 1727204085.85599: getting variables
12372 1727204085.85601: in VariableManager get_vars()
12372 1727204085.85656: Calling all_inventory to load vars for managed-node3
12372 1727204085.85659: Calling groups_inventory to load vars for managed-node3
12372 1727204085.85662: Calling all_plugins_inventory to load vars for managed-node3
12372 1727204085.85672: Calling all_plugins_play to load vars for managed-node3
12372 1727204085.85674: Calling groups_plugins_inventory to load vars for managed-node3
12372 1727204085.85677: Calling groups_plugins_play to load vars for managed-node3
12372 1727204085.85881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12372 1727204085.86056: done with get_vars()
12372 1727204085.86065: done getting variables
12372 1727204085.86113: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Remove test interfaces] **************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3
Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.042) 0:00:12.847 *****
12372 1727204085.86142: entering _queue_task() for managed-node3/shell
12372 1727204085.86344: worker is 1 (out of 1 available)
12372 1727204085.86359: exiting _queue_task() for managed-node3/shell
12372 1727204085.86374: done queuing things up, now waiting for results queue to drain
12372 1727204085.86376: waiting for pending results...
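Editor's note: the repeated skips in this part of the run all trace back to the same guard. The recorded false_condition is (ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9); the facts gathered earlier make it evaluate to False on this managed node, so each guarded task is skipped before its action plugin ever runs. A minimal sketch of a shell task carrying that guard follows; the shell body is a placeholder, not the contents of remove_test_interfaces_with_dhcp.yml.

- name: Remove test interfaces
  ansible.builtin.shell: |
    echo "placeholder - the real cleanup commands are not visible in this log"
  when: ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9

With this form, Ansible evaluates the expression against ansible_distribution and ansible_distribution_major_version from facts, and a False result produces exactly the skipping: [managed-node3] => {...} output recorded above.
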
12372 1727204085.86565: running TaskExecutor() for managed-node3/TASK: Remove test interfaces 12372 1727204085.86666: in run() - task 12b410aa-8751-244a-02f9-0000000001b5 12372 1727204085.86677: variable 'ansible_search_path' from source: unknown 12372 1727204085.86681: variable 'ansible_search_path' from source: unknown 12372 1727204085.86716: calling self._execute() 12372 1727204085.86791: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204085.86797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204085.86808: variable 'omit' from source: magic vars 12372 1727204085.87184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204085.89136: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204085.89188: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204085.89235: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204085.89267: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204085.89291: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204085.89362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204085.89387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204085.89410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204085.89448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204085.89464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204085.89572: variable 'ansible_distribution' from source: facts 12372 1727204085.89578: variable 'ansible_distribution_major_version' from source: facts 12372 1727204085.89590: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204085.89594: when evaluation is False, skipping this task 12372 1727204085.89596: _execute() done 12372 1727204085.89600: dumping result to json 12372 1727204085.89606: done dumping result, returning 12372 1727204085.89613: done running TaskExecutor() for managed-node3/TASK: Remove test interfaces [12b410aa-8751-244a-02f9-0000000001b5] 12372 1727204085.89621: sending task result for task 12b410aa-8751-244a-02f9-0000000001b5 12372 1727204085.89713: done sending task result for task 12b410aa-8751-244a-02f9-0000000001b5 12372 1727204085.89716: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204085.89764: no more pending results, returning what we have 12372 1727204085.89768: results queue empty 12372 1727204085.89769: checking for any_errors_fatal 12372 1727204085.89776: done checking for any_errors_fatal 12372 1727204085.89777: checking for max_fail_percentage 12372 1727204085.89779: done checking for max_fail_percentage 12372 1727204085.89780: checking to see if all hosts have failed and the running result is not ok 12372 1727204085.89781: done checking to see if all hosts have failed 12372 1727204085.89782: getting the remaining hosts for this loop 12372 1727204085.89783: done getting the remaining hosts for this loop 12372 1727204085.89788: getting the next task for host managed-node3 12372 1727204085.89796: done getting next task for host managed-node3 12372 1727204085.89799: ^ task is: TASK: Stop dnsmasq/radvd services 12372 1727204085.89803: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False
12372 1727204085.89807: getting variables
12372 1727204085.89809: in VariableManager get_vars()
12372 1727204085.89858: Calling all_inventory to load vars for managed-node3
12372 1727204085.89861: Calling groups_inventory to load vars for managed-node3
12372 1727204085.89864: Calling all_plugins_inventory to load vars for managed-node3
12372 1727204085.89873: Calling all_plugins_play to load vars for managed-node3
12372 1727204085.89876: Calling groups_plugins_inventory to load vars for managed-node3
12372 1727204085.89879: Calling groups_plugins_play to load vars for managed-node3
12372 1727204085.90042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12372 1727204085.90207: done with get_vars()
12372 1727204085.90216: done getting variables
12372 1727204085.90264: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Stop dnsmasq/radvd services] *********************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23
Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.041) 0:00:12.888 *****
12372 1727204085.90291: entering _queue_task() for managed-node3/shell
12372 1727204085.90500: worker is 1 (out of 1 available)
12372 1727204085.90514: exiting _queue_task() for managed-node3/shell
12372 1727204085.90528: done queuing things up, now waiting for results queue to drain
12372 1727204085.90530: waiting for pending results...
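Editor's note: when one of these tasks skips unexpectedly, the decision the log records as "Evaluated conditional (...): False" can be reproduced directly from the two facts the expression references. A small diagnostic sketch, assuming the standard debug module (this task is not part of the original playbook):

- name: Show the facts behind the skip decision
  ansible.builtin.debug:
    msg:
      - "distribution: {{ ansible_distribution }}"
      - "major version: {{ ansible_distribution_major_version }}"
      - "guard result: {{ (ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9) }}"
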
12372 1727204085.90724: running TaskExecutor() for managed-node3/TASK: Stop dnsmasq/radvd services 12372 1727204085.90815: in run() - task 12b410aa-8751-244a-02f9-0000000001b6 12372 1727204085.90827: variable 'ansible_search_path' from source: unknown 12372 1727204085.90831: variable 'ansible_search_path' from source: unknown 12372 1727204085.90865: calling self._execute() 12372 1727204085.90937: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204085.90943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204085.90954: variable 'omit' from source: magic vars 12372 1727204085.91326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204085.93266: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204085.93320: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204085.93348: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204085.93381: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204085.93407: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204085.93473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204085.93502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204085.93524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204085.93557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204085.93570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204085.93677: variable 'ansible_distribution' from source: facts 12372 1727204085.93683: variable 'ansible_distribution_major_version' from source: facts 12372 1727204085.93696: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204085.93700: when evaluation is False, skipping this task 12372 1727204085.93706: _execute() done 12372 1727204085.93709: dumping result to json 12372 1727204085.93721: done dumping result, returning 12372 1727204085.93724: done running TaskExecutor() for managed-node3/TASK: Stop dnsmasq/radvd services [12b410aa-8751-244a-02f9-0000000001b6] 12372 1727204085.93726: sending task result for task 12b410aa-8751-244a-02f9-0000000001b6 12372 1727204085.93815: done sending task result for task 12b410aa-8751-244a-02f9-0000000001b6 12372 1727204085.93822: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
12372 1727204085.93868: no more pending results, returning what we have
12372 1727204085.93872: results queue empty
12372 1727204085.93873: checking for any_errors_fatal
12372 1727204085.93881: done checking for any_errors_fatal
12372 1727204085.93882: checking for max_fail_percentage
12372 1727204085.93884: done checking for max_fail_percentage
12372 1727204085.93885: checking to see if all hosts have failed and the running result is not ok
12372 1727204085.93886: done checking to see if all hosts have failed
12372 1727204085.93887: getting the remaining hosts for this loop
12372 1727204085.93888: done getting the remaining hosts for this loop
12372 1727204085.93895: getting the next task for host managed-node3
12372 1727204085.93904: done getting next task for host managed-node3
12372 1727204085.93907: ^ task is: TASK: Restore the /etc/resolv.conf for initscript
12372 1727204085.93911: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
12372 1727204085.93915: getting variables
12372 1727204085.93919: in VariableManager get_vars()
12372 1727204085.93968: Calling all_inventory to load vars for managed-node3
12372 1727204085.93971: Calling groups_inventory to load vars for managed-node3
12372 1727204085.93974: Calling all_plugins_inventory to load vars for managed-node3
12372 1727204085.93984: Calling all_plugins_play to load vars for managed-node3
12372 1727204085.93987: Calling groups_plugins_inventory to load vars for managed-node3
12372 1727204085.93999: Calling groups_plugins_play to load vars for managed-node3
12372 1727204085.94366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12372 1727204085.94525: done with get_vars()
12372 1727204085.94534: done getting variables
12372 1727204085.94581: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Restore the /etc/resolv.conf for initscript] *****************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:248
Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.043) 0:00:12.931 *****
12372 1727204085.94605: entering _queue_task() for managed-node3/command
12372 1727204085.94824: worker is 1 (out of 1 available)
12372 1727204085.94838: exiting _queue_task() for managed-node3/command
12372 1727204085.94852: done queuing things up, now waiting for results queue to drain
12372 1727204085.94854: waiting for pending results...
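Editor's note: every skipped task in this stretch returns the same structured result ("changed": false, "false_condition": ..., "skip_reason": "Conditional result was False"). If a later step needed to react to such a skip, the result could be registered and tested with the "skipped" test. This is a hypothetical sketch only; the register name and command body are illustrative, and the real restore command at tests_bond_removal.yml:248 is not visible in this log.

- name: Restore the /etc/resolv.conf for initscript
  ansible.builtin.command: echo placeholder   # real command not shown in this log
  when: ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9
  register: resolv_restore                    # hypothetical register name

- name: Note whether the restore ran
  ansible.builtin.debug:
    msg: "restore skipped: {{ resolv_restore is skipped }}"
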
12372 1727204085.95043: running TaskExecutor() for managed-node3/TASK: Restore the /etc/resolv.conf for initscript 12372 1727204085.95139: in run() - task 12b410aa-8751-244a-02f9-0000000001b7 12372 1727204085.95151: variable 'ansible_search_path' from source: unknown 12372 1727204085.95185: calling self._execute() 12372 1727204085.95287: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204085.95296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204085.95306: variable 'omit' from source: magic vars 12372 1727204085.95670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204085.97684: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204085.97896: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204085.97901: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204085.97904: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204085.97913: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204085.98021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204085.98066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204085.98112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204085.98173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204085.98189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204085.98306: variable 'ansible_distribution' from source: facts 12372 1727204085.98317: variable 'ansible_distribution_major_version' from source: facts 12372 1727204085.98335: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204085.98339: when evaluation is False, skipping this task 12372 1727204085.98342: _execute() done 12372 1727204085.98344: dumping result to json 12372 1727204085.98350: done dumping result, returning 12372 1727204085.98358: done running TaskExecutor() for managed-node3/TASK: Restore the /etc/resolv.conf for initscript [12b410aa-8751-244a-02f9-0000000001b7] 12372 1727204085.98363: sending task result for task 12b410aa-8751-244a-02f9-0000000001b7 12372 1727204085.98462: done sending task result for task 12b410aa-8751-244a-02f9-0000000001b7 12372 1727204085.98465: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
12372 1727204085.98515: no more pending results, returning what we have
12372 1727204085.98522: results queue empty
12372 1727204085.98523: checking for any_errors_fatal
12372 1727204085.98532: done checking for any_errors_fatal
12372 1727204085.98533: checking for max_fail_percentage
12372 1727204085.98535: done checking for max_fail_percentage
12372 1727204085.98536: checking to see if all hosts have failed and the running result is not ok
12372 1727204085.98537: done checking to see if all hosts have failed
12372 1727204085.98538: getting the remaining hosts for this loop
12372 1727204085.98539: done getting the remaining hosts for this loop
12372 1727204085.98543: getting the next task for host managed-node3
12372 1727204085.98550: done getting next task for host managed-node3
12372 1727204085.98554: ^ task is: TASK: Verify network state restored to default
12372 1727204085.98557: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
12372 1727204085.98561: getting variables
12372 1727204085.98563: in VariableManager get_vars()
12372 1727204085.98620: Calling all_inventory to load vars for managed-node3
12372 1727204085.98624: Calling groups_inventory to load vars for managed-node3
12372 1727204085.98627: Calling all_plugins_inventory to load vars for managed-node3
12372 1727204085.98637: Calling all_plugins_play to load vars for managed-node3
12372 1727204085.98639: Calling groups_plugins_inventory to load vars for managed-node3
12372 1727204085.98643: Calling groups_plugins_play to load vars for managed-node3
12372 1727204085.98806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12372 1727204085.98976: done with get_vars()
12372 1727204085.98986: done getting variables

TASK [Verify network state restored to default] ********************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:253
Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.044) 0:00:12.976 *****
12372 1727204085.99067: entering _queue_task() for managed-node3/include_tasks
12372 1727204085.99286: worker is 1 (out of 1 available)
12372 1727204085.99303: exiting _queue_task() for managed-node3/include_tasks
12372 1727204085.99322: done queuing things up, now waiting for results queue to drain
12372 1727204085.99324: waiting for pending results...
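Editor's note: unlike the command/shell tasks above, this one is queued for the include_tasks action, so a False conditional skips the include itself and nothing from the included file is ever loaded, which is why only a single skip is recorded for it. A minimal sketch of that shape; the included filename is a placeholder, since the real file referenced at tests_bond_removal.yml:253 is not shown in this log.

- name: Verify network state restored to default
  ansible.builtin.include_tasks: tasks/check_network_state.yml   # placeholder filename
  when: ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9
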
12372 1727204085.99501: running TaskExecutor() for managed-node3/TASK: Verify network state restored to default 12372 1727204085.99598: in run() - task 12b410aa-8751-244a-02f9-0000000001b8 12372 1727204085.99609: variable 'ansible_search_path' from source: unknown 12372 1727204085.99643: calling self._execute() 12372 1727204085.99715: variable 'ansible_host' from source: host vars for 'managed-node3' 12372 1727204085.99722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 12372 1727204085.99733: variable 'omit' from source: magic vars 12372 1727204086.00099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12372 1727204086.02796: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12372 1727204086.02801: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12372 1727204086.02804: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12372 1727204086.02848: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12372 1727204086.02888: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12372 1727204086.02993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12372 1727204086.03038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12372 1727204086.03075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12372 1727204086.03139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12372 1727204086.03164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12372 1727204086.03359: variable 'ansible_distribution' from source: facts 12372 1727204086.03372: variable 'ansible_distribution_major_version' from source: facts 12372 1727204086.03392: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12372 1727204086.03402: when evaluation is False, skipping this task 12372 1727204086.03414: _execute() done 12372 1727204086.03423: dumping result to json 12372 1727204086.03433: done dumping result, returning 12372 1727204086.03445: done running TaskExecutor() for managed-node3/TASK: Verify network state restored to default [12b410aa-8751-244a-02f9-0000000001b8] 12372 1727204086.03456: sending task result for task 12b410aa-8751-244a-02f9-0000000001b8 12372 1727204086.03608: done sending task result for task 12b410aa-8751-244a-02f9-0000000001b8 12372 1727204086.03611: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12372 1727204086.03684: no more pending results, returning what we have 12372 1727204086.03691: results queue empty 12372 1727204086.03692: checking for any_errors_fatal 12372 1727204086.03699: done checking for any_errors_fatal 12372 1727204086.03700: checking for max_fail_percentage 12372 1727204086.03702: done checking for max_fail_percentage 12372 1727204086.03704: checking to see if all hosts have failed and the running result is not ok 12372 1727204086.03705: done checking to see if all hosts have failed 12372 1727204086.03706: getting the remaining hosts for this loop 12372 1727204086.03708: done getting the remaining hosts for this loop 12372 1727204086.03713: getting the next task for host managed-node3 12372 1727204086.03724: done getting next task for host managed-node3 12372 1727204086.03727: ^ task is: TASK: meta (flush_handlers) 12372 1727204086.03730: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204086.03734: getting variables 12372 1727204086.03737: in VariableManager get_vars() 12372 1727204086.03936: Calling all_inventory to load vars for managed-node3 12372 1727204086.03940: Calling groups_inventory to load vars for managed-node3 12372 1727204086.03943: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204086.03957: Calling all_plugins_play to load vars for managed-node3 12372 1727204086.03961: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204086.03966: Calling groups_plugins_play to load vars for managed-node3 12372 1727204086.04439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204086.04721: done with get_vars() 12372 1727204086.04734: done getting variables 12372 1727204086.04817: in VariableManager get_vars() 12372 1727204086.04842: Calling all_inventory to load vars for managed-node3 12372 1727204086.04844: Calling groups_inventory to load vars for managed-node3 12372 1727204086.04847: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204086.04853: Calling all_plugins_play to load vars for managed-node3 12372 1727204086.04857: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204086.04861: Calling groups_plugins_play to load vars for managed-node3 12372 1727204086.05056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204086.05329: done with get_vars() 12372 1727204086.05346: done queuing things up, now waiting for results queue to drain 12372 1727204086.05348: results queue empty 12372 1727204086.05349: checking for any_errors_fatal 12372 1727204086.05352: done checking for any_errors_fatal 12372 1727204086.05353: checking for max_fail_percentage 12372 1727204086.05354: done checking for max_fail_percentage 12372 1727204086.05355: checking to see if all hosts have failed and the running result is not ok 12372 1727204086.05356: done checking to see if all hosts have failed 12372 1727204086.05357: getting the remaining hosts for this loop 12372 1727204086.05358: done getting the remaining hosts for this loop 12372 1727204086.05361: getting the next task for host managed-node3 12372 
1727204086.05365: done getting next task for host managed-node3 12372 1727204086.05367: ^ task is: TASK: meta (flush_handlers) 12372 1727204086.05369: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12372 1727204086.05373: getting variables 12372 1727204086.05374: in VariableManager get_vars() 12372 1727204086.05398: Calling all_inventory to load vars for managed-node3 12372 1727204086.05401: Calling groups_inventory to load vars for managed-node3 12372 1727204086.05404: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204086.05416: Calling all_plugins_play to load vars for managed-node3 12372 1727204086.05419: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204086.05423: Calling groups_plugins_play to load vars for managed-node3 12372 1727204086.05665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204086.05922: done with get_vars() 12372 1727204086.05932: done getting variables 12372 1727204086.05987: in VariableManager get_vars() 12372 1727204086.06010: Calling all_inventory to load vars for managed-node3 12372 1727204086.06012: Calling groups_inventory to load vars for managed-node3 12372 1727204086.06015: Calling all_plugins_inventory to load vars for managed-node3 12372 1727204086.06020: Calling all_plugins_play to load vars for managed-node3 12372 1727204086.06022: Calling groups_plugins_inventory to load vars for managed-node3 12372 1727204086.06026: Calling groups_plugins_play to load vars for managed-node3 12372 1727204086.06211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12372 1727204086.06487: done with get_vars() 12372 1727204086.06502: done queuing things up, now waiting for results queue to drain 12372 1727204086.06504: results queue empty 12372 1727204086.06505: checking for any_errors_fatal 12372 1727204086.06507: done checking for any_errors_fatal 12372 1727204086.06508: checking for max_fail_percentage 12372 1727204086.06509: done checking for max_fail_percentage 12372 1727204086.06510: checking to see if all hosts have failed and the running result is not ok 12372 1727204086.06511: done checking to see if all hosts have failed 12372 1727204086.06512: getting the remaining hosts for this loop 12372 1727204086.06513: done getting the remaining hosts for this loop 12372 1727204086.06515: getting the next task for host managed-node3 12372 1727204086.06519: done getting next task for host managed-node3 12372 1727204086.06520: ^ task is: None 12372 1727204086.06521: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
12372 1727204086.06523: done queuing things up, now waiting for results queue to drain
12372 1727204086.06524: results queue empty
12372 1727204086.06525: checking for any_errors_fatal
12372 1727204086.06526: done checking for any_errors_fatal
12372 1727204086.06527: checking for max_fail_percentage
12372 1727204086.06528: done checking for max_fail_percentage
12372 1727204086.06529: checking to see if all hosts have failed and the running result is not ok
12372 1727204086.06530: done checking to see if all hosts have failed
12372 1727204086.06532: getting the next task for host managed-node3
12372 1727204086.06535: done getting next task for host managed-node3
12372 1727204086.06536: ^ task is: None
12372 1727204086.06537: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node3 : ok=7 changed=0 unreachable=0 failed=0 skipped=151 rescued=0 ignored=0

Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.075) 0:00:13.051 *****
===============================================================================
Gathering Facts --------------------------------------------------------- 1.80s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_initscripts.yml:5
Gather the minimum subset of ansible_facts required by the network role test --- 1.01s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Check if system is ostree ----------------------------------------------- 0.87s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Assert that the interface is present - 'test2' -------------------------- 0.16s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider --- 0.15s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Backup the /etc/resolv.conf for initscript ------------------------------ 0.14s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:28
fedora.linux_system_roles.network : Print network provider -------------- 0.13s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
Include the task 'get_interface_stat.yml' ------------------------------- 0.13s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3
fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 --- 0.12s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces --- 0.12s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
fedora.linux_system_roles.network : Ensure ansible_facts used by role --- 0.12s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable --- 0.11s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
TEST Add Bond with 2 ports ---------------------------------------------- 0.10s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:33
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.10s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Assert that the interface is present - 'test1' -------------------------- 0.10s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5
fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later --- 0.10s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Include the task 'enable_epel.yml' -------------------------------------- 0.09s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51
Include the task 'get_interface_stat.yml' ------------------------------- 0.09s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3
Include the task 'assert_profile_present.yml' --------------------------- 0.09s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:67
fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces --- 0.09s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
12372 1727204086.06658: RUNNING CLEANUP
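Editor's note on the end of the run: the two "TASK: meta (flush_handlers)" entries processed just before the recap appear to be the implicit handler-flush points Ansible appends at the end of a play, and the recap's skipped=151 is consistent with most tasks in this run being guarded by the same distribution/version conditional. The same handler flush can also be requested explicitly mid-play; the sketch below is a hypothetical illustration of that pattern, with a made-up handler name and a stand-in command rather than anything from the test playbook.

- hosts: managed-node3           # illustrative play
  tasks:
    - name: Change something that notifies a handler
      ansible.builtin.command: echo placeholder   # illustrative only
      changed_when: true                          # force a change so the handler is notified
      notify: restart example service

    - name: Run pending handlers now instead of waiting for the end of the play
      ansible.builtin.meta: flush_handlers

  handlers:
    - name: restart example service
      ansible.builtin.debug:
        msg: handler ran                          # stand-in for a real service restart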